What is Good Performance for an SAP System? (1)

  • Data archiving can play a critical role in helping businesses manage their data growth, especially if it is implemented as early as possible. Data archiving is a flexible and reliable online process that deals with business-complete data, meaning data that is no longer needed in everyday business processes.
  • During the first step, data is written to archive files in a compressed format. In this stage the program runs archivability checks to ensure that only business-complete data will be archived and that the data to be archived is no longer needed in daily business processes. During the second step, the program compares the database with the archive file and deletes from the database only the information that has been written to the archive file. The data may then be transferred to a storage system – such as a content server attached to the system via the ArchiveLink interface – for long-term archiving.
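The two-step procedure above can be sketched in Python. This is a hedged illustration only, not SAP's actual implementation: the table name `orders`, the `is_complete` flag, and the cutoff date are assumptions standing in for the real archivability checks.

```python
import gzip
import json
import sqlite3

def write_archive(conn, archive_path, cutoff):
    """Step 1: write business-complete rows older than the cutoff
    to a compressed archive file."""
    rows = conn.execute(
        "SELECT id, created, payload FROM orders "
        "WHERE is_complete = 1 AND created < ?", (cutoff,)
    ).fetchall()
    with gzip.open(archive_path, "wt", encoding="utf-8") as f:
        for r in rows:
            f.write(json.dumps({"id": r[0], "created": r[1], "payload": r[2]}) + "\n")
    return [r[0] for r in rows]

def delete_archived(conn, archive_path):
    """Step 2: delete from the database only the rows that are
    verifiably present in the archive file."""
    with gzip.open(archive_path, "rt", encoding="utf-8") as f:
        archived_ids = [json.loads(line)["id"] for line in f]
    conn.executemany("DELETE FROM orders WHERE id = ?",
                     [(i,) for i in archived_ids])
    conn.commit()
    return len(archived_ids)
```

Reading the archive file back before deleting mirrors the verification step described above: nothing is removed from the database unless it was successfully written to the archive.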


What is Good Performance for an SAP System? (2)


  • So, even if a user restricts the data selection in an exemplary fashion, old data from the index being read into random access memory (RAM) cannot be avoided – provided, of course, that the transferred data block contains a mixture of new and old data. But how can this happen? Basically, a mixture of new and old data in the database tables can occur whenever data is deleted from the database. When new data is added to the database, chances are high that it will be inserted into these "gaps". This also applies to indexes, but there is one peculiarity about indexes that we must be aware of. Index entries can generally be sorted in two different ways: either chronologically (organized in reference to a time factor) or non-chronologically (organized in reference to a factor other than time, e.g. GUIDs).
  • In chronologically sorted indexes, which also include indexes organized by document numbers (provided they ascend with time), old and new data never reside in close proximity to each other, making it virtually impossible for a "contaminated" data block with new and old data to be read into memory.
  • With non-chronologically sorted indexes, however, things are different. As such an index does not ascend over time, new data will be inserted in locations distributed across the entire width of the index – wherever there is a gap that is eligible for data insertion. This results in many data blocks containing a mixture of new and old data. When such a data block is read into memory, the buffer quality for query accesses goes down, leading to a considerable decrease in performance.
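The contrast between the two index types can be illustrated with a toy simulation. This is not SAP or database internals, just a model: an index is represented as a sorted list of keys split into fixed-size "blocks", and we count how many blocks end up holding both old and new entries. The block size and key ranges are arbitrary assumptions.

```python
import random

BLOCK_SIZE = 100  # entries per simulated index block

def mixed_blocks(old_keys, new_keys):
    """Merge new keys into a sorted index of old keys and count the
    blocks that contain a mixture of old and new entries."""
    old = set(old_keys)
    merged = sorted(old_keys + new_keys)
    mixed = 0
    for i in range(0, len(merged), BLOCK_SIZE):
        block = merged[i:i + BLOCK_SIZE]
        has_old = any(k in old for k in block)
        has_new = any(k not in old for k in block)
        mixed += has_old and has_new
    return mixed

random.seed(0)
# Chronological keys: new entries sort strictly after the old ones,
# so only the single boundary block can mix old and new data.
chrono_old = list(range(9_950))
chrono_new = list(range(9_950, 10_950))
# GUID-like keys: new entries land anywhere in the key space,
# "contaminating" blocks across the entire width of the index.
guid_old = random.sample(range(10**9), 9_950)
guid_new = random.sample(range(10**9), 1_000)
```

Running `mixed_blocks` on both pairs shows the effect the text describes: the chronological index yields a single mixed block at the boundary, while the GUID-style index mixes old and new data in nearly every block, which is exactly what degrades buffer quality.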


  • Archived SAP data is managed and stored in a storage system
  • Reduction of necessary disk space
  • 1 GB of disk space can cost between 1,000 and 5,000 euros per year. This includes hardware, software, maintenance, and the human resources related to storage and storage management. Implementing these recommendations reduces database growth.
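A back-of-the-envelope check of the savings, using the per-GB range quoted above; the default rate picks the low end of that range, and the 50 GB figure is purely illustrative.

```python
def annual_storage_cost(gigabytes, cost_per_gb_eur=1_000):
    """Yearly storage cost in euros, using the quoted range of
    1,000 - 5,000 euros per GB per year (hardware, software,
    maintenance, and storage-management staff included)."""
    return gigabytes * cost_per_gb_eur

# Archiving 50 GB of business-complete data saves, at the low end,
# 50 * 1,000 = 50,000 euros per year; at the high end, 250,000.
low_end = annual_storage_cost(50)
high_end = annual_storage_cost(50, cost_per_gb_eur=5_000)
```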
  • SAP data archiving should be considered
    1. If the database size is greater than 200 GB, or the database grows at a rate greater than 10 GB per month
    2. If a large or rapidly growing database has been found during a previous delivery of Safeguarding services such as:
    • SAP EarlyWatch Alert
    • SAP EarlyWatch Check
    • SAP GoingLive Check
    • SAP Solution Management Assessment
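The first rule of thumb above can be expressed as a simple check. The thresholds come directly from the text; the function name and inputs are illustrative, and the second criterion (findings from Safeguarding services) is a manual assessment that is not modeled here.

```python
def should_consider_archiving(db_size_gb, growth_gb_per_month):
    """Flag a system for data archiving when the database exceeds
    200 GB or grows faster than 10 GB per month."""
    return db_size_gb > 200 or growth_gb_per_month > 10
```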


  • ADK (Archive Development Kit) is the SAP standard interface between the SAP data archiving file system and the storage system
  • WebDAV, which is an extension to the HTTP protocol, stands for "Web Distributed Authoring and Versioning" and allows clients to perform remote Web content authoring operations. In addition to these authoring features, WebDAV is particularly valuable for its state-of-the-art communication protocol, which addresses several limitations of HTTP by providing capabilities, for example, for creating hierarchies (collections) and managing metadata (properties).
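Two of the WebDAV capabilities mentioned above map to HTTP extension methods: MKCOL creates a collection (hierarchy) and PROPFIND reads metadata (properties). The sketch below only builds the raw request text to show their shape; it does not contact a server, and the host and paths are hypothetical.

```python
# PROPFIND body asking for all properties, per the WebDAV XML vocabulary.
PROPFIND_BODY = (
    '<?xml version="1.0" encoding="utf-8"?>'
    '<D:propfind xmlns:D="DAV:"><D:allprop/></D:propfind>'
)

def webdav_request(method, path, host, body=""):
    """Assemble the raw text of a WebDAV request for illustration."""
    lines = [
        f"{method} {path} HTTP/1.1",
        f"Host: {host}",
    ]
    if method == "PROPFIND":
        # Depth: 1 requests properties of the collection and its
        # immediate members.
        lines.append("Depth: 1")
    if body:
        lines.append("Content-Type: application/xml; charset=utf-8")
        lines.append(f"Content-Length: {len(body.encode('utf-8'))}")
    return "\r\n".join(lines) + "\r\n\r\n" + body

# Create a collection, then list its properties.
mkcol = webdav_request("MKCOL", "/archive/2024/", "content-server.example")
propfind = webdav_request("PROPFIND", "/archive/", "content-server.example",
                          PROPFIND_BODY)
```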
