
SAP Standard Application Benchmarks

SAP Standard Application Benchmarks help customers and partners find the appropriate hardware configuration for their IT solutions. Working in concert, SAP and our hardware partners developed the SAP Standard Application Benchmarks to test the hardware and database performance of SAP applications and components.

Benefits of Benchmarking

Released for technology partners, benchmarks provide basic sizing recommendations to customers by placing a substantial load upon a system during the testing of new hardware, system software components, and relational database management systems (RDBMS). All performance data relevant to system, user, and business applications are monitored during a benchmark run and can be used to compare platforms.

Customers can benefit from the benchmark results in various ways. For example, benchmark results illuminate the scalability and manageability of large installations. The benchmark results:

  • Allow users to compare different platforms
  • Enable proof-of-concept scenarios
  • Provide an outlook for future performance levels (new platforms, new servers, and so on)
  • Provide basic information to configure and size SAP Business Suite

In general, SAP's technology partners – particularly hardware vendors – run the benchmarks, testing different business application scenarios on specific hardware. The primary objective is to use the results to determine an optimal hardware configuration for a customer system. In addition, marketing departments use benchmark results to support sales.

For SAP, benchmarks also represent an excellent opportunity for quality assurance. For example, benchmarks are run internally to monitor resource consumption when a new release is developed. Benchmarks are also used to analyze system configurations and parameter settings, even though they cannot fully reflect individual customer environments.

Running an SAP Standard Application Benchmark

During the benchmark run, all relevant performance parameters of an SAP system are monitored, for example, CPU utilization, memory consumption, the I/O system, network load, as well as functional errors and system availability. As a result, "special tunings" to achieve better results are impossible.
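The sampling loop behind such monitoring can be sketched as follows. This is a minimal illustration only; the function names and the stubbed metrics reader are assumptions, not SAP's actual benchmark tooling:

```python
# Illustrative sketch (names are assumptions): polling relevant performance
# parameters at a fixed interval for the duration of a benchmark run.
import time

def sample_metrics(read_metrics, duration_s: float, interval_s: float):
    """Poll read_metrics() every interval_s seconds for duration_s seconds,
    returning a list of (timestamp, metrics) samples."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        samples.append((time.monotonic(), read_metrics()))
        time.sleep(interval_s)
    return samples

# Example with a stubbed metrics reader standing in for real OS counters
samples = sample_metrics(lambda: {"cpu_pct": 42.0, "mem_mb": 512}, 0.05, 0.01)
print(len(samples) > 0)  # True
```

In a real run the reader would collect the quantities named above (CPU utilization, memory consumption, I/O and network load) from the operating system rather than returning constants.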

Each SAP Standard Application Benchmark consists of a number of script files that simulate typical and popular transactions and workflows in a particular business scenario, together with a predefined SAP client database that contains sample company data against which the benchmark is run. The benchmark run takes at least 15 minutes, and the throughput results are then extrapolated to one hour. The output files are thoroughly analyzed for any divergence from the expected behavior; these technology checks result in a benchmark certification.
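The extrapolation step is simple linear scaling, which can be illustrated with a short sketch (the names and the 15-minute constant below follow the description above; this is not SAP's certification tooling):

```python
# Illustrative sketch: extrapolating throughput measured over a benchmark
# run of at least 15 minutes to an hourly figure.

MIN_RUN_SECONDS = 15 * 60  # a benchmark run takes at least 15 minutes

def hourly_throughput(completed_steps: int, run_seconds: float) -> float:
    """Extrapolate the number of completed dialog steps to one hour."""
    if run_seconds < MIN_RUN_SECONDS:
        raise ValueError("run must last at least 15 minutes")
    return completed_steps * 3600.0 / run_seconds

# Example: 120,000 dialog steps completed in a 20-minute run
print(hourly_throughput(120_000, 20 * 60))  # 360000.0
```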


SAP Benchmark Council

Established in 1995, the SAP Benchmark Council monitors all benchmarking activities and provides standards for publishing benchmark results in the SAP environment. Representatives of SAP and technology partners, such as hardware and database vendors involved in benchmarking, make up the SAP Benchmark Council.

On behalf of the SAP Benchmark Council, SAP defines and controls the content of the benchmarks and establishes rules that determine the testing procedure during benchmarking. Technology partners submit the benchmark results to SAP for certification. There are a number of rules and guidelines that must be followed to obtain a certification. The certificate is the basis for any benchmark publication by the technology partners.
Some of the benchmark rules include:

  • The average dialog response time must be below one second
  • Each benchmark user must have a fixed think time of ten seconds
  • The tested hardware and the release combination of operating system, database, and SAP releases must be generally available for customers within six months after the certification of the benchmark
  • The following SAP software settings can be changed during benchmarking:
    • The number of instances
    • The number of dialog work processes
    • The number of update processes
    • The size of SAP buffers

In general, only those tuning features are allowed that can also be used and applied in a productive customer installation.
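The response-time and think-time rules above can be expressed as a small check (a hypothetical sketch; the function and constant names are illustrative, not part of SAP's certification process):

```python
# Illustrative sketch of two benchmark rules: a fixed ten-second think time
# per benchmark user, and an average dialog response time below one second.

THINK_TIME_SECONDS = 10.0       # fixed think time per benchmark user
MAX_AVG_RESPONSE_SECONDS = 1.0  # certification threshold

def passes_response_time_rule(response_times: list[float]) -> bool:
    """True if the average dialog response time is below one second."""
    avg = sum(response_times) / len(response_times)
    return avg < MAX_AVG_RESPONSE_SECONDS

print(passes_response_time_rule([0.4, 0.9, 0.7]))  # True
print(passes_response_time_rule([1.2, 1.5, 0.8]))  # False
```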

Publications, Policy and Violations

Read the SAP Standard Application Benchmark Publication Policy to learn about practices for the publication of information related to SAP Standard Application Benchmarks.

Publication violations recognized by the SAP Benchmark Council workgroup are listed in the table below. The table includes a high-level description of the violation, corrections suggested by the workgroup, and reasons for accepting the ruling.

  • 09/24/2015 – Oracle
    Violation: Starting September 17, 2015, Oracle violated the benchmark publication guidelines in an advertisement campaign on its Web site and in a published white paper. Four BW-EML benchmarks performed by Oracle that were not officially certified were referenced and compared to other benchmark results. Oracle also copied the table structure and format of the benchmark results tables on sap.com/benchmark to give customers the impression that its benchmark results were officially certified.
    Suggested correction: Remove all information related to benchmarks that are not officially certified. Remove the white paper from the Web. Stop the marketing campaign using non-certified benchmark results immediately.
    Clarification: Refer to section 4.2.2 of the benchmark publication rules (4.2.2: any publication may only include numbers that refer to published benchmark results).
  • 5/5/2011 – IBM
    Violation: In April, IBM violated the benchmark publication guidelines in an advertisement on its Web site. Price/performance statements were made with reference to SAP SD benchmark results.
    Suggested correction: Explicitly state that only TPC results are used for price/performance, not SAP.
    Clarification: Refer to section 4.2.8 of the benchmark publication policy.
  • 8/4/2010 – IBM
    Violation: In February, IBM violated the benchmark publication guidelines in a presentation on its Web site. Price/performance statements were made, and the minimum data concerning the benchmarks mentioned was not provided.
    Suggested correction: Remove the information related to price/performance and add the required minimum data.
    Clarification: Refer to sections 1 and 4.2.8 of the benchmark publication policy.
  • 7/3/2008 – IBM
    Violation: In May, IBM violated the benchmark publication rules by including performance-per-watt information in conjunction with benchmark results on its public Web site.
    Suggested correction: Remove the information related to power consumption from the Web page.
    Clarification: Refer to section 4.2.2 of the benchmark publication policy.
  • 6/5/2008 – IBM
    Violation: In May, IBM published a presentation at an international event that violated the benchmark publication guidelines. Incorrect leadership statements were made, and the minimum data concerning the benchmarks mentioned was not complete.
    Suggested correction: Change the presentation.
    Clarification: Refer to chapter 1 and sections 4.1, 4.2, and 4.3 of the benchmark publication policy.
  • 11/2/2006 – IBM
    Violation: In October, IBM published a press release in which it violated the benchmark publication guidelines concerning the minimum data required.
    Suggested correction: Change or remove the Web page.
    Clarification: Refer to chapter 1 of the benchmark publication policy.
  • 3/2/2006 – HP
    Violation: In January, HP violated the benchmark publication policy by including pricing information in conjunction with a benchmark result on its public Web site.
    Suggested correction: Change the Web site.
    Clarification: Refer to sections 4.1 and 4.2 of the benchmark publication policy.
  • 12/8/2005 – Microsoft
    Violation: In November, Microsoft published a press release in which it violated the benchmark publication guidelines concerning the minimum data required.
    Suggested correction: Change the press release on the Web site.
    Clarification: Refer to sections 1.1 and 1.2 of the benchmark publication policy.
  • 3/4/2004 – IBM
    Violation: In February, IBM published a press release violating the fence claim rules.
    Suggested correction: Change the press release on the Web or remove it.
    Clarification: Refer to section 4.11 of the benchmark publication policy.
  • 6/11/2003 – Oracle
    Violation: In May 2003, Oracle published benchmark information on a Web page that contained misleading information, such as newly defined benchmark categories and record results in these. The minimum data for comparing benchmark results was not included.
    Suggested correction: Change the Web site.
    Clarification: Refer to sections 4.1, 4.2, and 4.6 of the benchmark publication policy.
  • 3/6/2003 – IBM
    Violation: On February 7, 2003, IBM published a press release that included pricing information about the benchmarked system.
    Suggested correction: Change the Web site.
    Clarification: Section 4.8 (document version 2.3) of the publication policy stipulates that pricing information should not be included in a press release; SAP should not have approved the press release.
  • 6/12/2002 – IBM
    Violation: In May 2002, IBM did not include the minimum data for two benchmark results in an advertising campaign.
    Suggested correction: Since the running campaign cannot be stopped, include the minimum data in new advertisements.
    Clarification: Refer to section 1 of the publication policy.
  • 2/7/2002 – Oracle
    Violation: In December 2001 and January 2002, three Oracle publications were found to violate the publication guidelines through comparisons of different tests and misleading benchmark information.
    Suggested correction: Since the running campaign cannot be stopped, include the minimum data in new advertisements.
    Clarification: Refer to sections 4.1, 4.2, and 4.3 of the publication policy.
  • 7/12/2001 – Oracle
    Violation: In June 2001, a non-existent "Standard SAP SD Light Benchmark" was mentioned in a public presentation on Oracle's homepage.
    Suggested correction: Change the Web site.
    Clarification: There is no "Standard SAP SD Light Benchmark".
  • 7/12/2001 – Oracle
    Violation: In June 2001, Oracle compared two SAP Standard Application Benchmark results in a press release without stating the appropriate publication details.
    Suggested correction: Change the press release on the Web.
    Clarification: Refer to section 1 of the publication policy.
  • 7/12/2001 – Oracle
    Violation: In June 2001, Oracle did not include the minimum data for two benchmark results in an advertising campaign.
    Suggested correction: Since the running campaign cannot be stopped, include the minimum data in new advertisements.
    Clarification: Refer to section 1 of the publication policy.