6 Simple steps to integrate source data into an EPM solution

Dec 17, 2021


Introduction 

The data flow from a source system into an Enterprise Performance Management (EPM) system is always a vital component of any successful implementation project. Understanding the required output of an EPM system will drive the Extract, Transform and Load (ETL) design process that the source data must go through. In turn, this will drive the way the project team surfaces the data to the organisation’s EPM system.

Challenge description 

The process of surfacing data consumed and transformed by the organisation’s EPM application is often confusing and complex as there are many factors to consider throughout the integration.

Why is this a problem?  

When designing an EPM system, the result or desired output we expect is often the focus of the consulting team, with the “nuts and bolts” of how we get there then falling to the build team. A vital element of any implementation process is understanding where we will get the data and how we will transform it into what is required.

What is the solution model? 

Start with the end in mind: Integration teams must understand the type of data the organisation’s EPM application will consume, the expected output reporting from the application, and how they should transform the data throughout the process. The teams must then plan the process accordingly and communicate with the organisation’s subject matter resources for any special requirements, including system access, data understanding, and business rules to embed in the transformation process.

Six steps in our solution model: 

  • Start with the end in mind – know the key elements of the EPM system’s expected end output.
  • Determine the source systems and all the complexities of extracting that data.
  • Request relevant access to all source systems for the integration team and the designated connection user.
  • Prepare data for EPM consumption – prepare the data in the required format at the source if possible. The more formatted the data is by the time it reaches the EPM application, the simpler the mapping and transformation will be.
  • Test the data – test the ETL process at each stage of the transformation, and ensure all scenarios are covered.
  • Document the entire ETL process.

Key Considerations

ETL design 

Once the solution model is in place, integration teams must consider the split between data manually entered into the EPM application and imported data. Best practice is to automate as much of it as possible.

Five tips for not wasting time in the ETL: 

  • Arrange security access early, so it is in place when the integration team starts.
  • Get assistance from a client subject matter resource to understand the source data.
  • Understand the size (volume) of the expected integration data, including how and when it is summarised.
  • Ensure the integration team works closely with the organisation’s teams to understand the data that needs to come through, especially the level of detail required.
  • Establish the organisation’s most significant security concerns: does the integration team control the extraction script, or do the DBAs?

Understand the data imported into the EPM application

The most common integrated data is Financial Data, which typically starts as a trial balance import. This, fortunately, is not a complex extraction, and anyone with an intermediate understanding of SQL (in the case of a database) can perform it.

Sub-ledger data, such as sales information that requires linking to a general ledger (GL) account, may require a more intimate knowledge of the source system’s database tables.
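As an illustration of the level of SQL involved, the sketch below pulls a trial balance and a simple sub-ledger extract using Python and the pyodbc library (one possible tooling choice; the same queries can run in whatever tool the integration team prefers). The connection string, table names and column names (GL_BALANCES, SALES_LINES, GL_ACCOUNTS and so on) are hypothetical placeholders, not a reference to any particular ERP schema.

    import pyodbc  # assumes an ODBC driver for the source database is installed

    # Hypothetical DSN and credentials for the source system.
    conn = pyodbc.connect("DSN=SourceERP;UID=epm_integration;PWD=***", timeout=30)
    cursor = conn.cursor()

    # Trial balance: closing balance per GL account for one entity and period.
    trial_balance_sql = """
        SELECT entity_code, account_code, SUM(amount) AS closing_balance
        FROM GL_BALANCES
        WHERE entity_code = ? AND period = ?
        GROUP BY entity_code, account_code
    """
    trial_balance = cursor.execute(trial_balance_sql, ("ENT01", "2021-12")).fetchall()

    # Sub-ledger example: sales lines linked back to their GL control account.
    sales_sql = """
        SELECT s.customer_code, a.account_code, SUM(s.net_amount) AS net_sales
        FROM SALES_LINES s
        JOIN GL_ACCOUNTS a ON a.account_id = s.gl_account_id
        WHERE s.entity_code = ? AND s.period = ?
        GROUP BY s.customer_code, a.account_code
    """
    sales = cursor.execute(sales_sql, ("ENT01", "2021-12")).fetchall()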

Once the integration team has established which data they should import into the system, they need to ask two key questions:

  1. What is the source of this data?
    • Database
    • Flat file
    • Excel-based
  2. How accessible is it? (see the connectivity sketch after this list)
    • Database
      – Is it on the same network as the EPM system?
      – Would the integration team have access to it?
      – Would the integration team need additional security privileges, such as a VPN or API access, to reach the data?
    • Flat File or Excel document
      – Is the file accessible from the network the integration team is on?
      – Is the file consistently produced with the same formatting?
      – Is the file format fit for importing into the EPM application?
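As a minimal sketch of answering these accessibility questions up front, the Python snippet below checks that the source database and a flat-file export can actually be reached from the EPM network before any ETL work starts. The DSN, credentials, file path and timeout values are illustrative assumptions only.

    from pathlib import Path
    import pyodbc

    # Hypothetical source locations; replace with the organisation's actual details.
    SOURCE_DSN = "DSN=SourceERP;UID=epm_integration;PWD=***"
    FLAT_FILE = Path(r"\\fileserver\exports\trial_balance_2021_12.csv")

    def database_reachable() -> bool:
        """Confirm the EPM network can log in to the source database."""
        try:
            conn = pyodbc.connect(SOURCE_DSN, timeout=10)  # login timeout in seconds
            conn.timeout = 600  # allow long-running extraction queries on large sources
            conn.close()
            return True
        except pyodbc.Error as exc:
            print(f"Database not reachable: {exc}")
            return False

    def flat_file_reachable() -> bool:
        """Confirm the export file sits on a share the integration team can read."""
        return FLAT_FILE.exists()

    print("Database reachable:", database_reachable())
    print("Flat file reachable:", flat_file_reachable())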

Once connectivity to the source system has been established, the integration team needs to format and prepare the data before integrating it into the application.

For data imported from a database, the integration team must balance the volume of summarised versus detailed data. For example, if they query a detailed General Ledger table with a million lines, it makes sense to summarise the data to the lowest level of granularity that still carries all the required fields, rather than integrating every transaction. Ideally, the EPM application should then provide drill-down functionality so that the underlying detail can be accessed on demand.
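The sketch below illustrates that trade-off under the same hypothetical schema: rather than pulling every journal line from a detailed GL_TRANSACTIONS table, the query groups the detail down to the combination of fields the EPM model actually needs.

    import pandas as pd
    import pyodbc

    conn = pyodbc.connect("DSN=SourceERP;UID=epm_integration;PWD=***", timeout=30)

    # Summarise millions of journal lines down to one row per reporting combination.
    summary_sql = """
        SELECT entity_code, account_code, cost_centre, period,
               SUM(debit_amount - credit_amount) AS net_movement
        FROM GL_TRANSACTIONS
        WHERE entity_code = ? AND period = ?
        GROUP BY entity_code, account_code, cost_centre, period
    """
    summary = pd.read_sql(summary_sql, conn, params=["ENT01", "2021-12"])
    print(f"{len(summary)} summarised rows ready for the EPM staging area")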

When extracting financial data, the integration team should know the required parameters for each entity from which data is sourced.

One of the most overlooked parameters is financial periods, especially their start and end dates and, of course, adjustment periods. Unfortunately, adjustment periods are often not mentioned as a parameter, as clients tend to think about them only at year-end.

Finally, the integration team prepares the data query either as a stored procedure in the source database or as a SQL query in the EPM platform, so that the EPM application can readily consume it.

Stored procedure or query 

The EPM platform would then pass parameters to the source database stored procedure to fetch the data, or in some cases, would pass a query to the source database in a select statement.

The method of fetching the data from the source system may be of consequence to the organisation, especially if the application will fetch data over the internet. For example, security practitioners regard passing only parameters to the source system as safer than passing an entire script, because a script may expose information about specific databases and tables.

Creating a Stored Procedure would also require the integration team to involve the organisation’s database team to assist in the process, which can burden the timelines and efficiency of the implementation project.

Alternatively, if the integration team writes the query in an EPM business rule, they can directly control the script and change it when needed, which adds to the project’s flexibility and efficiency.
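To make the two approaches concrete, the hedged sketch below shows both patterns from the EPM side using pyodbc: calling a stored procedure owned by the source DBAs, with only parameters crossing the connection, versus sending the full select statement that the integration team controls. The procedure name usp_GetTrialBalance and its signature are hypothetical.

    import pyodbc

    conn = pyodbc.connect("DSN=SourceERP;UID=epm_integration;PWD=***", timeout=30)
    cursor = conn.cursor()

    # Option 1: the DBAs own the logic; the EPM side passes only parameters.
    cursor.execute("{CALL dbo.usp_GetTrialBalance (?, ?)}", ("ENT01", "2021-12"))
    via_procedure = cursor.fetchall()

    # Option 2: the integration team sends the full statement and keeps control
    # of the script, at the cost of exposing table names over the connection.
    inline_sql = """
        SELECT entity_code, account_code, SUM(amount) AS closing_balance
        FROM GL_BALANCES
        WHERE entity_code = ? AND period = ?
        GROUP BY entity_code, account_code
    """
    cursor.execute(inline_sql, ("ENT01", "2021-12"))
    via_query = cursor.fetchall()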

Flat file and Excel file  

For data integrated from a flat file, formatting consistency of the source file is a crucial requirement, as the integration team must be able to assure the organisation that the file is a trusted and consistent source to import from.
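One lightweight way to enforce that consistency is to reject a load as soon as the file no longer matches the agreed layout. The sketch below checks a hypothetical CSV export’s header row before anything is imported; the column names and file path are assumptions.

    import csv
    from pathlib import Path

    # Hypothetical layout the source system has agreed to produce every period.
    EXPECTED_COLUMNS = ["Entity", "Account", "CostCentre", "Period", "Amount"]

    def file_is_consistent(path: Path) -> bool:
        """Return True only if the export still matches the agreed column layout."""
        with path.open(newline="") as handle:
            header = next(csv.reader(handle))
        return header == EXPECTED_COLUMNS

    export = Path(r"\\fileserver\exports\trial_balance_2021_12.csv")
    if not file_is_consistent(export):
        raise ValueError("Source file layout has changed; halt the EPM import")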

Mapping data to the organisation’s EPM dimensions 

Once the organisation has its data in the required format from the source, the integration team can start the mapping process to integrate the data into a staging area. The mapping process entails mapping source fields to EPM dimensions. The organisation validates the data as relevant to the EPM tool during this stage.

Flat files require a particular setup for integration into the EPM tool. First, the integration team needs to identify columns, the data length in those columns, and row indicators. Once that is set up, the data import can then populate the staging area.

Data mapping of data integrated from a database is a more straightforward process than for flat files, as a lot of the work has already gone into designing the query or stored procedure.
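As a hedged illustration of the mapping step, the sketch below renames hypothetical source columns to equally hypothetical EPM dimension names before the rows are written to the staging area; the actual dimension names depend entirely on the EPM application’s design.

    import pandas as pd

    # Hypothetical mapping of source fields to EPM dimensions.
    SOURCE_TO_DIMENSION = {
        "Entity": "Entity",
        "Account": "Account",
        "CostCentre": "CostCentre",
        "Period": "Time",
        "Amount": "Value",
    }

    source = pd.read_csv(r"\\fileserver\exports\trial_balance_2021_12.csv")

    # Keep only the mapped columns and rename them to the EPM dimension names.
    staging = source[list(SOURCE_TO_DIMENSION)].rename(columns=SOURCE_TO_DIMENSION)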

Transformation of data 

Once the data has populated the staging tables within the EPM application, the transformation can occur. Here the system validates the critical source data columns and checks them against the setup rules that dictate how that data must appear in the EPM application.
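A minimal sketch of that validation step is shown below: the staging rows are checked against the setup rules, represented here by hypothetical sets of valid dimension members, and anything unmapped is surfaced for correction rather than loaded silently.

    import pandas as pd

    # Hypothetical setup rules: the dimension members the EPM application accepts.
    VALID_ENTITIES = {"ENT01", "ENT02"}
    VALID_ACCOUNTS = {"1000", "2000", "4000", "5000"}

    def validate_staging(staging: pd.DataFrame) -> pd.DataFrame:
        """Return the staging rows whose critical columns break the setup rules."""
        failed = staging[
            ~staging["Entity"].isin(VALID_ENTITIES)
            | ~staging["Account"].astype(str).isin(VALID_ACCOUNTS)
        ]
        if not failed.empty:
            print(f"{len(failed)} rows need mapping corrections before the EPM load")
        return failed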

At this stage, the data becomes the responsibility of the EPM specialist and is governed by the design of the system.

The impediments and pitfalls to avoid: 

  1. Connectivity access.
    One of the most time-consuming aspects of integration is always access to the source system. In general, whenever the organisation brings a party from outside the integration team into the integration process, it must accept that some inefficiency will follow. The size of the organisation’s source data may also lead to extended data-fetch times, so timeout settings need to be adjusted to accommodate that.

  2. Data preparation.
    If the integration team does not know which tables in the data source hold the required data, the organisation may need to bring in a third party or an internal database administrator (DBA) who knows the schema to assist with the data extraction query. This may also extend to involving the ERP specialist who supports the organisation’s system in the data preparation.

  3. Adjustment periods.
    One of the most overlooked requirements is how to deal with adjustment periods, typically periods 13 and 14, which accommodate adjustments made at year-end from auditor instructions or corrections.
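One hedged way to handle this is to decide, as part of the ETL design, where periods 13 and 14 should land in the EPM time dimension. The sketch below folds them into a hypothetical year-end adjustment member; an equally valid design is to map them onto period 12.

    # Hypothetical mapping of year-end adjustment periods to an EPM time member.
    ADJUSTMENT_PERIODS = {13: "Dec_Adjustments", 14: "Dec_Adjustments"}

    def epm_period(source_period: int) -> str:
        """Translate a source financial period into the EPM time member."""
        if source_period in ADJUSTMENT_PERIODS:
            return ADJUSTMENT_PERIODS[source_period]
        return f"P{source_period:02d}"  # ordinary periods 1-12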

Conclusion

Data integration forms a critical component for any system implementation, and EPM systems are no different. In our experience, the data-sensitive nature of EPM systems requires integration specialists to plan holistically for the complete integration and testing cycle. Over the years, our observations in this area have enabled us to deliver complex implementations much more efficiently since we continue to build our learnings into our iterative improvements.

 

Jigsaw Advisory specialises in guiding CFOs and finance teams through their digital transformation journey.  We value and nurture our niche position as a trusted transformation advisor to many top-level CFOs and are so confident about our recommendations that we choose to implement the solutions ourselves.  That way, we fully understand the business context for the challenge and use that to serve as the focus throughout the implementation and benefits realisation process. Get in touch with our senior team here.