Function Points FAQs

Applications Based Upon Software Packages


Issue Description

Functional Overview

General Discussion and Resolution

Transaction Functions

Classification of Transactional Functions


ISSUE

How should the functional size of applications and projects based upon package software be determined?

How do the methods used to functionally size such applications and projects differ from those used to size custom developed applications and projects?



FUNCTIONAL OVERVIEW

Increasingly, organisations are purchasing packaged software in order to deliver the functional requirements of their software applications. The functional size of the software packages can vary from small, specialised packages (<200 function points) to extremely large, generalised software packages (>20,000 function points). It is the large software packages that provide the greatest challenge for FP Analysts.

Packaged software rarely provides a perfect match for the customer's functional requirements. The organisation must either re-engineer their business processes to match the package functionality, or customise the package functions to match their business requirements. This customisation may involve:

  1. Tailoring existing package functions via package supplied, customer defined tables
  2. Developing subroutines and common use modules which are accessed via package supplied 'user exits'
  3. Altering package supplied source code
  4. Developing new functions, typically reports and application interfaces, which are integrated into the production system.

The software package may also deliver considerably more functionality than is required by the customer. In conducting the function point count the FP Analyst must determine which of the package's functions have been implemented, and which of the implemented functions are actually used. For example, if package functionality is implemented by module, particular modules may include functions that will not be used.

Measuring the functional size of applications based upon large software packages is a major task due to the amount of functionality delivered by the standard application suite and because of the difficulties associated with sizing external vendor supplied packaged software. Of particular concern to FPA are the:

  • Sheer size of large packaged applications, for example the SAP based applications
  • Lack of system documentation for 'vanilla' package functions
  • Lack of a logical representation of data maintained by the packaged system
  • Significant amounts of reused functions that appear in multiple package modules
  • Heavy reliance on application experts to identify implemented functionality
  • Lack of access to package application experts

Commonly encountered examples of applications based upon packaged software are SAP based applications. The SAP (Systems, Applications and Products) suite of applications, developed by the German company SAP AG, and the associated tools developed by SAP partners such as Microsoft, are used widely within many industries. Many of the examples that follow are based upon SAP.



GENERAL DISCUSSION AND RESOLUTION

Applications based upon packaged software can be counted in the same way as applications built in-house; they just require more effort in the modelling stages of the count because packaged software is not usually delivered with system information that supports taking a logical business view of system functions. Rather, the system documentation describes the physical implementation of screens and data tables. Care must be taken to count the logical business view, not the physical implementation.

The planning phase of the function point count is critical to achieving a successful outcome and should address the following issues:

Count Purpose

The purpose for counting must be very clearly established at the outset as this determines the scope of the function point counting activity. Examples of count purposes associated with packaged software may include:

  1. Application Baseline Count
  • To determine the % of package functionality implemented by the organisation
  • To evaluate the degree of match between the customer's functional requirements and the functionality provided by the package
  • To determine the implemented size of the application for application portfolio reporting
  2. Development or Enhancement Projects
  • To determine the functional size of the customisation project only
  • To assist in project estimation
  • To determine project productivity rates for delivery of customised vs non-customised functions
  • To functionally size the interfacing functions required to incorporate the package into the organisation's existing systems architecture

Only functions which are relevant to the purpose for counting should be included within the count scope.

Application Boundary

Large software packages tend to incorporate modules which, in a normal systems environment, would constitute multiple stand-alone applications. Software packages are usually required to interface to existing systems within the organisation's software portfolio. It is also possible that, within the one organisation, multiple applications may be based upon the same software package.

The guidelines detailed in Section 3.1.2 Positioning of Application Boundaries apply equally to custom-developed software and applications based upon packaged software.

Source Information

One of the major difficulties with functionally sizing applications and projects based upon packaged software is the lack of source information on which to base the count. Information sources include:

  1. Application Boundary Diagram - Normally available, as the organisation is integrating the package application into their existing systems environment.
  2. Functional Decomposition/Menu Access Paths - Some packages provide a tool which permits the printing of menu structures for on-line system functions. For example, the SAP Dynamic Menu tool.
  3. Logical Data Model - Rarely available with packaged software
  4. Physical File/Table Lists - More likely to be available than logical data models. Difficulties are encountered in identifying which tables represent logical files and which are actually used in the application being sized.
  5. Lists of Batch Functions - Rarely available as the package batch functionality is considered to be background processing. Application users have very little visibility of the batch functions. Specifications for customised batch functions should be available.
  6. Screen and Report Lists - Should be available.
  7. Interface Specifications - Normally available as this is usually custom built functionality.

Prior to commencing the function point count, the availability of the above system documentation must be established. The quality and accuracy of the count is determined to a large extent by the quality of the source information on which it is based.

Count Resources - Application Expert

FP Analysts who functionally size applications and projects based upon packaged software must rely heavily upon the advice of the application expert to assist in identifying implemented functionality, as well as software package experts who provide advice on package functionality which is implemented without customisation. If such assistance is not available the efficiency and accuracy of the count is negatively impacted.

The responsibilities of the Application Expert, or their representatives, are to:

  • Assemble the source information for the count and verify that the documentation provided to the FP Analyst is complete and is the most current version available.
  • Provide the FP Analyst with a system overview on the first scheduled count day.
  • Identify implemented 'Vanilla' functions
  • Assist in identifying and arbitrating on the business view of system functions. For example, mapping physical screens to logical functions, arbitrating on unique vs variant transactions.
  • Answer FP Analyst questions related to application functionality on an as required basis.
  • Review and approve the count scope and detail

The role of the Application expert is crucial to the success of the count.

Count Recording Level

Function Point Counts can be performed at different levels. The count level is determined by:

  • Count purpose
  • Required accuracy
  • Quality of source information
  • Time available to complete sizing activities

Application Baseline Function Point Counts

Very large package based applications are usually required to be sized in a time and cost effective manner. The most appropriate Count Level for the application baseline counts is a hybrid count that combines elements of a Default Complexity Count and the less accurate High Level Count.

The Default Complexity Count approach is used for functionality that is adequately documented, for example custom developed and/or modified core functions, or functionality that can be viewed via the online system. The High Level Count approach is used for inadequately documented functions, typically batch functionality and/or "Vanilla" package functions.

The hybrid approach is appropriate when an application baseline function size is required for a large application (>4000 FPs), within a short time frame, and it is envisaged that the count result will be maintained over time. The lack of documentation available for 'Vanilla' and batch package functions precludes the adoption of more detailed counting levels for the baseline counts.
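
As an illustration of the Default Complexity element of such a hybrid count, the sketch below tallies an unadjusted size using the standard IFPUG weights, with files defaulted to Low and transactions to Average as described later in this document. The function names and the Python form are illustrative only, not part of the counting standard.

  # Sketch of a Default Complexity tally. Assumes standard IFPUG unadjusted weights;
  # files default to LOW complexity, transactions to AVERAGE (per the conventions in this FAQ).
  IFPUG_WEIGHTS = {
      "ILF": {"LOW": 7, "AVERAGE": 10, "HIGH": 15},
      "EIF": {"LOW": 5, "AVERAGE": 7, "HIGH": 10},
      "EI":  {"LOW": 3, "AVERAGE": 4, "HIGH": 6},
      "EO":  {"LOW": 4, "AVERAGE": 5, "HIGH": 7},
      "EQ":  {"LOW": 3, "AVERAGE": 4, "HIGH": 6},
  }
  DEFAULT_COMPLEXITY = {"ILF": "LOW", "EIF": "LOW", "EI": "AVERAGE", "EO": "AVERAGE", "EQ": "AVERAGE"}

  def default_complexity_size(functions):
      """functions: list of (name, function type) pairs, e.g. ("Create Bank", "EI")."""
      return sum(IFPUG_WEIGHTS[ftype][DEFAULT_COMPLEXITY[ftype]] for _, ftype in functions)

  # Illustrative functions only - not taken from a real count.
  sample = [("Bank", "ILF"), ("Create Bank", "EI"), ("Display Bank", "EQ"), ("Bank Listing", "EO")]
  print(default_complexity_size(sample))  # 7 + 4 + 4 + 5 = 20 unadjusted function points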

Enhancement Project Function Point Counts

The most appropriate count level for enhancement projects is a Detailed Linked Count. Enhancement projects tend to be smaller in size than new development projects, or application baseline counts, and, as a consequence, are not subject to the same time and budget constraints. Also, the source information required to support detailed function point counting is more readily available.

Use of Attributes

The Attributes feature in SCOPE allows functions on the hierarchy to be flagged so that they may be selectively grouped as a sub-set for analysis and reporting. The sub-set can then be included in, or excluded from, the calculations of a particular function point count, e.g. report the functional size of all transactions that are 'Custom Developed functionality' within the Accounts Payable business area. Other Function Point Repository Tools may have similar features to support selective reporting of count results.

When the purpose for functional sizing is to estimate project schedules, or determine project productivity rates, it is important to acknowledge the unique characteristics of different methods of implementing the required functions for applications based upon packaged software. These include:

  1. "Vanilla" Functions - standard package functions implemented without any customisation
  2. "Vanilla Plus" Functions - standard package functions customised via configuration table definitions by users.
  3. Custom Developed Functions - functions which incorporate custom developed subroutines or are complete custom developed elementary processes. The package usually provides user exits to permit the organisation to develop their own functions, common use modules, system interfaces, etc. using a package specific programming language, for example SAP's ABAP programming language.
  4. Modified Core Functions - customised functions which are delivered by modifying the underlying source code for standard package functions.
  5. Hybrid Functions - a mix of the above

An attribute called PACKAGE should be introduced to identify the method used to deliver the function.

ATTRIBUTE CATEGORY: PACKAGE

ATTRIBUTES:
  • Standard Package Function (Implemented without change)
  • Standard Package Function (Customised via configuration tables)
  • Custom Developed Function
  • Modified Core Function
  • Hybrid Function

Further options should be added to this attribute as required.

The use of this attribute facilitates the reporting of the count according to the different implementation methods used and therefore allows different productivity rates to be determined.
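
For example, once each counted function carries a PACKAGE attribute value, the count can be summarised per delivery method. The sketch below is a minimal illustration only; the records, attribute values and field layout are hypothetical and would come from the repository tool in practice.

  from collections import defaultdict

  # Hypothetical count records: (function name, PACKAGE attribute value, unadjusted FPs).
  count_records = [
      ("Create Vendor Invoice", "Standard Package Function (Implemented without change)", 4),
      ("Post GST Adjustment",   "Custom Developed Function", 4),
      ("Payment Run Report",    "Modified Core Function", 5),
  ]

  def size_by_package_attribute(records):
      """Aggregate unadjusted FPs by PACKAGE attribute value."""
      totals = defaultdict(int)
      for _, attribute, fps in records:
          totals[attribute] += fps
      return dict(totals)

  for attribute, fps in sorted(size_by_package_attribute(count_records).items()):
      print(f"{attribute}: {fps} FPs")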

Identification of Impacted Functionality - Additional Issue Related to Packages.

Packaged applications often include 'filler' fields within tables to cater for future data requirements. When, as a result of an enhancement project, the filler fields are defined, or existing unused package fields are redefined (for example, VAT is redefined to GST), this is counted as a logical change and the function which uses the fields is considered impacted.

Development of Functional Hierarchy.

For package based applications a functional model is developed in SCOPE based on the physical structure of the package menus. The functional model will reflect the order, sequence and hierarchical dependence of these menus. Users familiar with the package menu structure should therefore have no difficulty in navigating around the Function Point Count.

It is recommended that FP Analysts adhere to the package menu structures, even when the physical structure may contravene more logical structuring guidelines. An organisation may implement multiple versions of the same package, as different applications, and may wish to compare application counts. Hence, to aid the mapping process, the count structures must be as closely aligned as possible. If the applications are based upon different versions of the packaged software, this will introduce some differences in menu structures at the outset.

For the Custom Developed functions (Default Complexity), the functional model will be decomposed down to the transaction level.

For Vanilla functions (High Level), the functional model must decompose to at least level 5. Given that the large packaged applications tend to deliver functionality for many business areas, and for what would normally be multiple standalone applications, the decomposition must go to lower levels than would normally be developed for a High Level count. An example of a functional decomposition based upon a SAP based application is as follows:

Level 1 - Business Areas - Office, Logistics, Accounting, Human Resources

Level 2 - Business Functions - Financial Accounting, Treasury, Controlling

Level 3 - Applications - General Ledger, Accounts Payable

Level 4 - Major Modules - Account Management, Document Entry

Level 5 - Major Functions - Manage Account, Manage Account Group
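
A functional model of this kind can be held as a simple nested structure keyed by menu branch. The sketch below records one Accounting branch of the decomposition above; the structure and the traversal are illustrative only, and any branches beyond those listed above would be added in the same way.

  # Illustrative nested representation of the package menu based functional model.
  functional_model = {
      "Accounting": {                      # Level 1 - Business Area
          "Financial Accounting": {        # Level 2 - Business Function
              "General Ledger": {          # Level 3 - Application
                  "Account Management": [  # Level 4 - Major Module
                      "Manage Account",    # Level 5 - Major Function
                      "Manage Account Group",
                  ],
              },
          },
      },
  }

  def level5_functions(model, path=()):
      """Walk the model and yield (menu path, Level 5 major function) pairs."""
      for branch, children in model.items():
          if isinstance(children, list):
              for function in children:
                  yield path + (branch,), function
          else:
              yield from level5_functions(children, path + (branch,))

  for menu_path, function in level5_functions(functional_model):
      print(" > ".join(menu_path), "->", function)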

Functional Naming Standards.

Where possible the function point count should follow the functional naming standards used within the packaged software. This will aid in future function identification for enhancement project counts. Functional naming standards must be decided at the outset of the count and documented in the Count Reports to ensure they are consistently used.

The FP Analyst should seek the advice and agreement of the application expert on the proposed naming conventions.

The following description is intended to be a generic guide to identifying data and transactional functions for applications based upon packaged software.

Data Functions

Identification of Data Functions - ILFs and EIFs.

Most packaged software does not provide documentation that describes the logical view of the data that is stored and accessed. There are usually no logical data models, entity relationship descriptions or attribute lists. Utilities/tools may be available that facilitate the production of a list of physical tables. It is this list that is usually used as the basis for identification and analysis of files. Custom developed tables may not be included in the package list and the FP Analyst must establish whether additional documentation should be obtained.

The structure of the package physical tables is often designed for performance and technical requirements rather than user business requirements. Hence the number of physical tables significantly exceeds the number of logical files.

Within any one physical file list it is assumed that there will be no duplicate files.

Each of the steps that follow applies regardless of how the function has been delivered. The main difference lies in the source information used for the assessments.

Step 1 Identification of Unused Tables

Ideally the physical file list should include only those files that are used by the application being counted. Application baseline counts reflect production systems, therefore 'used' files are those that contain data records. 'Unused' files, on the other hand, are files with zero data records plus those identified by the application expert as being 'unused'.

In some cases systems statistics can provide information on file/table update activity. Files that have been unused for a significant period of time are candidates for further investigation. Care should be taken before eliminating tables with no update activity as some reference tables contain very stable data.

The application expert plays an important role in identifying system utilities that may be used to produce file lists, update statistics and file record counts. Ultimately the application expert should determine which data tables are 'used/unused'.

Step 2 Identification of Technical Tables

Based upon file name alone, identify tables that are assumed to perform a technical function only (a simple name-based filter is sketched after this list). This category would include the following types of tables:

  • Backup Files, Sort Files, Temporary/work files
  • Audit Files - IFPUG counting guidelines do not permit the counting of audit files as separate logical files.
  • Report/Extract Files - Assume that these files are either temporary files or counted as External Outputs
  • Load Files - exclude if there is a matching Master file with the same name
  • Codes Tables - Assume that any table named a Code Table contains only 2 attributes - Code Id and Code Description.
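
The name-based screening above lends itself to a simple script. The sketch below assumes the physical table list is available as plain names; the keyword patterns and table names shown are illustrative only and would be agreed with the application expert before any table is excluded.

  import re

  # Illustrative keyword patterns marking technical tables; not an exhaustive list.
  TECHNICAL_PATTERNS = ["backup", "sort", "temp", "work", "audit", "extract", "code"]

  def split_technical_tables(table_names):
      """Split a physical table list into assumed-technical tables and remaining candidates."""
      technical, candidates = [], []
      for name in table_names:
          if any(re.search(pattern, name, re.IGNORECASE) for pattern in TECHNICAL_PATTERNS):
              technical.append(name)
          else:
              candidates.append(name)
      return technical, candidates

  tables = ["VENDOR_MASTER", "GL_POSTING_WORK", "COUNTRY_CODE", "INVOICE_AUDIT", "BANK"]
  technical, candidates = split_technical_tables(tables)
  print("Excluded as technical:", technical)
  print("Remaining candidates :", candidates)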

Step 3 Grouping Related Tables

Based upon file name alone, identify tables that are assumed to belong to the one logical entity. This category would include the following types of tables:

  • 'Header' and 'Detail' files - Assume that if the two tables have the same name they belong to the same logical file.

Step 4 Other Assumptions

Based upon file name alone identify the following types of tables and include them within the scope of the count:

  • History Files - History files that have the same file definition as the original file should not be counted separately; however, table attributes are not available, therefore assume the history files store summarised data which cannot be derived from other files.
  • Statistics Files - Assume the statistics files store summarised data which cannot be derived from other files
  • 'Dummy' Files - Files that do not support the main business functions of the application being assessed, but have data recorded within them to support generic package processing. For example, within SAP, GL Files that are primarily maintained by one SAP application, may also be updated by another SAP application within the same organisation to satisfy basic SAP financial requirements.

Investigation of the logical groups of data maintained by elementary business processes can also assist in identifying logical files. If a series of transactions maintains and reports on the same group of data, then it is assumed that the data is a logical file. For example, the count may identify the following transactions - Create Bank, Change Bank, Display Bank, Mark Bank for Deletion, Display Changes to Bank. This group of transactions identifies a logical group of data called BANK. Where the maintenance and reporting functions are Custom Developed, and the file BANK does not appear in the physical file lists (for custom developed functions), it should be added to the System/Count File List.
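
A rough way to surface such candidate logical files is to strip the action verbs from transaction names and group what remains. The sketch below uses the Bank example above; the verb list is an assumption, and the result is only a starting point for the application expert to confirm.

  # Illustrative action verbs; stripping them exposes the business object the transactions maintain.
  ACTION_VERBS = ("Display Changes to", "Create", "Change", "Display", "Mark", "Delete")

  def candidate_logical_files(transaction_names):
      candidates = {}
      for name in transaction_names:
          for verb in ACTION_VERBS:  # longest, most specific verbs listed first
              if name.startswith(verb + " "):
                  entity = name[len(verb):].strip()
                  entity = entity.split(" for ")[0]  # "Mark Bank for Deletion" -> "Bank"
                  candidates.setdefault(entity, []).append(name)
                  break
      return candidates

  transactions = ["Create Bank", "Change Bank", "Display Bank",
                  "Mark Bank for Deletion", "Display Changes to Bank"]
  print(candidate_logical_files(transactions))  # {'Bank': [... all five transactions ...]}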

Step 5 Files Included Within Count Scope

Assume that all remaining physical tables are logical files and include them within the count scope.

If possible, group files by category/functional model. Investigate the application standards for file naming and follow these where possible. For example, different file groups ('vanilla' package files, custom developed files) may be prefixed by an identifier.

Step 6 Validation of File Contribution

Most package based applications exhibit standard Management Information Systems (MIS) characteristics, therefore the percentage contribution of logical files for the package based application is expected to reflect industry figures. The International Software Benchmarking Standards Group (ISBSG) Release 6 repository data suggests that logical files (i.e. both ILFs and EIFs) should comprise 27% of application functionality for standard MIS applications.

At the completion of the count, determine the % contribution of the logical files currently counted ('Vanilla' and Customised) and compare it to the industry figures. If there is a significant difference, this must be further investigated.

Where the files contribute a significantly lower percentage than industry norms it is possible to introduce an Adjustment file (with appropriate multiplier) to the count file list and aggregate the additional function points in this way.
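
A sketch of this validation step follows. The 27% figure is the ISBSG reference quoted above; the tolerance used before flagging a count for investigation, and the figures in the example, are assumptions rather than ISBSG rules.

  ISBSG_FILE_CONTRIBUTION = 0.27  # ILFs + EIFs as a share of total size for standard MIS applications
  TOLERANCE = 0.05                # assumed acceptable deviation before further investigation

  def check_file_contribution(file_fps, total_fps):
      """Return the actual file contribution and whether it differs enough to investigate."""
      actual = file_fps / total_fps
      needs_investigation = abs(actual - ISBSG_FILE_CONTRIBUTION) > TOLERANCE
      return actual, needs_investigation

  # Hypothetical figures: 310 FPs of files in an 1800 FP application baseline.
  actual, investigate = check_file_contribution(file_fps=310, total_fps=1800)
  print(f"Files contribute {actual:.0%} of the count; investigate further: {investigate}")
  # ~17% is well below the 27% benchmark, so the file list warrants review.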

Classification of Data Functions

The classification of logical files is determined by the way in which they are used by transactions. If the transactions:

  • update files, the derived file type is Internal Logical File (ILF).
  • only read from the files, the derived file type is External Interface File (EIF).

Neither Default Complexity, nor High Level counts require transactions to be linked to files. Therefore, file type cannot be derived from linked transactions.

Once the file list for the application has been finalised, the application expert, together with the FP Analyst, will review the list in order to identify any External Interface Files. It is expected that only the Custom Developed logical files (i.e. typically reference tables loaded from another system) will satisfy the criteria for EIFs. Within standard MIS systems, EIFs contribute 1-4% of the total logical files.

Only those files that clearly satisfy the conditions of an EIF should be assigned this type. All other files are counted as ILFs.

Assessment of Data Functions

Most software packages do not provide documentation to allow FP Analysts to accurately determine the number of Data Element Types (DETs) and Record Element Types (RETs) on a logical file.

Both Default Complexity and High Level counts assign a 'low' complexity rating to logical files, as per industry profiles.

Enhancement Projects will be counted as Detailed Linked counts. For these counts, file complexities will be determined provided the required source information is available to make the assessment.

However for application baseline counts, if it is obvious that several physical files belong to the one logical file, the physical files are regarded as RETs on the ILF and a complexity of AVERAGE or even HIGH may be assumed.
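
One way to apply this rule is to treat each grouped physical table as a RET and read the complexity off the RET axis of the IFPUG ILF complexity matrix. Because the DET count is unknown for package files, the sketch below assumes the middle (20 to 50 DET) band; that assumption, and the thresholds that follow from it, should be stated in the count report.

  def ilf_complexity_from_grouped_tables(physical_table_count):
      """Treat each grouped physical table as a RET; assumes the 20-50 DET band of the IFPUG matrix."""
      if physical_table_count <= 1:
          return "LOW"       # 1 RET
      if physical_table_count <= 5:
          return "AVERAGE"   # 2-5 RETs
      return "HIGH"          # 6 or more RETs

  for grouped in (1, 3, 8):
      print(grouped, "physical table(s) grouped ->", ilf_complexity_from_grouped_tables(grouped))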



Transactional Functions

Identification of Transactional Functions

There are three groups of transactions that must be identified:

  • On-line/Screen
  • Batch
  • System Interfaces (usually a special category of batch functions)

Each of these transaction types can be delivered via:

  • 'Vanilla' functions
  • 'Vanilla Plus' functions
  • Custom Developed functions
  • Modified Core Functions
  • Hybrid functions - a mix of the above

The source information from which the functions are identified depends upon the type of function, and how it is delivered. For example, a 'Vanilla' on-line/screen function may only appear in a list of physical transactions, or menu decomposition, and there will be no other supporting documentation. A Custom Developed interface function may not appear in a package physical transaction list but may have a multi page specification associated with it.

Within some software packages (for example SAP) there is the ability to list all the physical transactions. This transaction list corresponds to:

  • business transactions (e.g. SAP CRC1 Create Resource)
  • menu screens (e.g. SAP CS01BOM Menu screen)
  • data load programs (e.g. SAP CRT3 Upload C Tables Resource)
  • user friendly technical functions (e.g. Copy functions)
  • purely technical functions (e.g. Backup, Submit functions)

The advantage of using such a list is that there are no duplicates, and all functions, regardless of which business group has update responsibility, and in which module the function appears, are included. The disadvantage is that transactions cannot be easily assigned to their functional modules (without the assistance of the application expert) and hence the relative size of different functional modules cannot be determined.

Such a list should be used as a primary information source only when other sources are not available. The ability to produce a physical transaction list depends upon the software package. For example, SAP has the Dynamic Menu Tool. This is a very valuable aid as it permits the user to decompose the SAP menu structure to the physical transaction level, including only selected branches. The resulting menu decomposition can then be used to identify implemented modules and functions. Also custom developed screens are included in the decomposition.

Custom developed functions tend to be batch functions and system interfaces. Lists of custom developed batch functions and system interfaces, and accompanying specifications are usually available.

Each of the steps that follow applies regardless of how the function has been delivered. The main difference lies in the source information used for the assessments.

Step 1 Identification of Unused Functions

Using either a menu decomposition or physical transaction lists, the application expert and FP Analyst must identify implemented functions.

Where lists of Custom Developed functions (screens, interfaces and batch functions) are available it is assumed that all functions on the list are used.

Step 2 Identification of Non-Business Functions

Based upon transaction name alone, identify transactions that are assumed to perform a technical function or navigation function only and exclude them from the count scope. This category would include the following types of transactions:

  • Technical Functions - Backup, Sort, Archive transactions
  • User Friendly functions - Copy transactions
  • Menu screens and PF keys which permit navigation
  • Load transactions - which do not perform any business processing but simply copy or manage files
  • Codes Tables maintenance functions - If a table is identified as a Codes table and excluded from the logical file list, its corresponding maintenance functions should also be excluded.

These transactions should be deleted from the menu decomposition/ transaction list.

Step 3 Identification of Duplicate Transactions

Duplicate transactions should not occur in physical transaction lists. They will occur in menu decompositions. Reused functions can appear in multiple menus. This is particularly true for SAP based applications.

When performing a High Level count using menu decompositions, the only way the duplicate functions can be identified is by the application expert, or if the FP Analyst recognises the reused module/transaction names while analysing and tallying transactions. Duplicate status can be proved at the transaction level if the transactions have the same transaction Id. (This is true for SAP based applications where all transactions are prefixed by a four character Transaction Id.)

Duplicates should not occur within lists of Custom Developed functions used for Default Complexity counts. For example, duplicates would not be expected in a list of System Interface functions except where they are listed under batch jobs as well as under report numbers.

Duplicates can occur in Batch program lists, particularly when the lists are related to a calendar of events. For example the same program may occur in the daily, weekly and monthly run lists. Only the application expert can identify the batch duplicates.

In all cases transactions should occur only once in transaction lists/menu decompositions and the duplicates should be crossed out so that they are not double counted when recording and 'tallying' transactions.
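
Where the package prefixes every transaction with an identifier (as SAP does with its four character Transaction Id), duplicates in a menu decomposition can be removed mechanically. The sketch below assumes each menu line starts with the Transaction Id; the transaction names shown are illustrative.

  def deduplicate_by_transaction_id(menu_lines):
      """Keep the first occurrence of each Transaction Id; later repeats are reuse, not new functions."""
      seen, unique, duplicates = set(), [], []
      for line in menu_lines:
          transaction_id = line.split()[0]
          if transaction_id in seen:
              duplicates.append(line)
          else:
              seen.add(transaction_id)
              unique.append(line)
      return unique, duplicates

  menu = ["CRC1 Create Resource", "FB01 Post Document", "CRC1 Create Resource"]
  unique, duplicates = deduplicate_by_transaction_id(menu)
  print("Count once :", unique)
  print("Crossed out:", duplicates)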

Step 4 Expansion of Generic Functions

Some screen functions have generic names, for example Create Document, Display Document. Where such transactions are identified, application expert advice should be sought to determine whether different document types trigger different logical processing.

Also screen names prefixed with the verb 'Maintain' should be investigated to see if they support multiple functions.

When conducting High Level counts, time constraints may not permit this degree of investigation, however the investigation is warranted for Default Complexity counts.

When counting batch functions, one batch 'job' may involve multiple elementary processes. FP Analysts should seek application expert advice when analysing lists of batch jobs/programs/'runs'.

Step 5 Grouping of Related Functions

Based upon transaction name alone, identify transactions that are assumed to belong to the one logical transaction. This category would include the following types of transactions:

  • 'Select/Find' and 'Display/Print' transactions - Assume that if the two screens have the same name they are the parameter input, and data output, components of the same logical data extract function.
  • 'Header' and 'Detail' Screens - Assume that if the two screens have the same name they belong to the same logical transaction.
  • 'Submit' screens - Assume that such screens trigger a batch process eg report production, which will be identified elsewhere.

Step 6 Transactions Included Within Count Scope

Assume that all remaining physical transactions are logical transactions and include them within the count scope.

Group transactions by category/functional module. This may not be possible for batch and system interface functions, and they may appear as a separate component within the functional decomposition.

List Default Complexity (Custom Developed) transactions individually on the Functional Hierarchy with a Transaction Id prefix.

'Vanilla' package functions are grouped together under generic transaction group names and included in the Functional Hierarchy, linked to their functional modules, with an abbreviated transaction prefix, derived from the transaction identifier (where transaction identifiers are used in the package software). The 'multiplier' field is used to record the number of transactions within the group (a sketch of this multiplier-based tallying follows the list below). Generic transaction group names are:

  • On-Line Update
  • Batch Update
  • Data Load
  • Reports
  • Enquiries
  • Extracts
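
A sketch of tallying grouped 'Vanilla' functions via the multiplier field follows. The mapping from each generic group to a transaction type, and the use of Average complexity weights, are assumptions consistent with the conventions in this FAQ that the FP Analyst would confirm for a real count.

  # Assumed mapping from generic transaction group to IFPUG transaction type.
  GROUP_TO_TYPE = {"On-Line Update": "EI", "Batch Update": "EI", "Data Load": "EI",
                   "Reports": "EO", "Extracts": "EO", "Enquiries": "EQ"}
  AVERAGE_WEIGHT = {"EI": 4, "EO": 5, "EQ": 4}  # IFPUG unadjusted weights at Average complexity

  def vanilla_group_size(groups):
      """groups: list of (generic group name, multiplier) pairs recorded on the Functional Hierarchy."""
      return sum(multiplier * AVERAGE_WEIGHT[GROUP_TO_TYPE[group]] for group, multiplier in groups)

  recorded = [("On-Line Update", 12), ("Reports", 7), ("Enquiries", 20)]
  print(vanilla_group_size(recorded))  # 12*4 + 7*5 + 20*4 = 163 unadjusted function points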

 



Classification of Transactional Functions

The type of each transaction function is determined primarily from the function name. However, when using a menu decomposition, transaction names may only comprise a Transaction Id and a single verb, noun or adjective, for example, "Display", "Changes", "Resubmission". The FP Analyst should refer to the preceding hierarchy level, as this level's name, in combination with the transaction level name, allows the transaction to be classified.

In some cases transaction type is not obvious from the transaction name or the screen accessed. For data extraction functions (EO and EQ), a decision is made based upon the expected output. Where a clear distinction cannot be made between an EI or EO/EQ, assume the transaction is an EI.

For particular types of transactions, terms used in the physical transaction name are used to determine transaction type. The following standards apply (a rule-of-thumb classifier based on these defaults is sketched after the list):

  • "Analysis"
EO: derived data assumed
  • "Monitor"
EO: assume the purpose of the transaction is to report a derived result
  • "Calculate"
EO/EI
(a) assume that the purpose of the calculation is to report results rather than to maintain file data - count as an EO.

(b) where data is calculated and stored on file where it is accessed by several other functions it should be counted as an EI

  • "Display"
EQ: (Usually) Where the name indicates that the output includes derived assign data transaction type EO.
  • "List"
EQ: assume that all list transactions are legitimate user required business functions and not merely selection screens for subsequent transaction processing.
  • "Activate" "Assign" "Flag" "Post"
EI: assume that transactions initiate processing which resulted in the update of data within ILFs, generally by changing the status of the record.
  • "Maintain"

EI: as a default assume this equates to two logical functions, Create and Modify
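
These defaults lend themselves to a simple rule-of-thumb classifier. The sketch below encodes only the default branch of each rule; the 'Calculate', 'Display' and 'Maintain' cases still require application expert judgement, and the sample transaction names are illustrative.

  # Default transaction type by term in the name, per the standards listed above.
  NAME_RULES = [
      ("Analysis",  "EO"), ("Monitor", "EO"), ("Calculate", "EO"),  # default (a): reports a derived result
      ("Display",   "EQ"), ("List",    "EQ"),
      ("Activate",  "EI"), ("Assign",  "EI"), ("Flag", "EI"), ("Post", "EI"),
  ]

  def classify_transaction(name):
      """Return the default transaction type. 'Maintain' screens are expanded to Create and Modify
      (both EI) before classification and are not handled here."""
      for term, transaction_type in NAME_RULES:
          if term.lower() in name.lower():
              return transaction_type
      return "EI"  # where no clear distinction can be made, default to EI

  for name in ("Display Vendor Balance", "Post Goods Receipt", "Monitor Batch Input"):
      print(name, "->", classify_transaction(name))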


Assessment of Transactional Functions

For most package based functions the complexity of the transactions cannot be accurately evaluated, i.e. the exact number of DETs entered/updated/displayed, and particularly the File Types Referenced (FTRs), cannot be determined.

Both Default Complexity and High Level counts assign an AVERAGE complexity rating to transactions, as per industry profiles.

Enhancement Projects are conducted using Detailed Linked counting. For these counts, transaction complexities will be determined provided the required source information is available to make the assessment.
