Fraud and Fraud Detection. Gee Sunder


That is, the cause is the independent variable that impacts the dependent effect. An example would be citing the amount of rainfall as causing the growth of grass. Care must be taken, however, as many events that appear to be associated may not actually have a cause-and-effect relationship.

      Online analytical processing (OLAP) tools are frequently used with the CDA process. They allow the user to extract data selectively and view the data from different perspectives or dimensions.
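As a minimal illustration of viewing the same data along different dimensions, here is a sketch in plain Python; OLAP tools do this interactively and at far larger scale, and the transaction records and the roll_up helper below are hypothetical:

```python
from collections import defaultdict

# Hypothetical transaction records: (region, quarter, amount).
transactions = [
    ("East", "Q1", 100.0), ("East", "Q2", 150.0),
    ("West", "Q1", 200.0), ("West", "Q2", 50.0),
]

def roll_up(rows, dim):
    """Total the amounts along one dimension (0 = region, 1 = quarter)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dim]] += row[2]
    return dict(totals)

print(roll_up(transactions, 0))  # the data viewed by region
print(roll_up(transactions, 1))  # the same data viewed by quarter
```

Selecting a different `dim` is the pivot: the underlying records never change, only the perspective from which they are summarized.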

      QDA is used to draw conclusions from nonquantitative or non-numerical data such as images or text. While typically employed in the social sciences, it can be used in organizational audits of controls, procedures, and processes.

      Data analytics provides insight into the dataset: it can uncover underlying data relationships and structures, test assumptions and hypotheses, identify variables in causal relationships, and detect anomalies.

      

DATA ANALYTICAL SOFTWARE

      There are a number of software programs that analyze data. Software such as Microsoft Access[12] or Microsoft Excel[13] is familiar to many people and used by many businesses and individuals. Indeed, Excel is favored and frequently used by accountants and auditors. Access and Excel are suitable when the dataset is not large and the analysis is not complex. While it is possible to do more complex procedures, many steps are necessary, and the user may need to perform operations and formulas that are not commonly used.

      These products are not recommended as professional analytic tools: their more complex functions are time-consuming to learn, and they lack data-integrity safeguards. It is easy to inadvertently change the contents of a cell by touching the wrong key. Processing is also slow and can be cumbersome when applied to large amounts of data.

      Professional or dedicated data analysis software, such as ACL,[14] Arbutus,[15] and IDEA,[16] is specifically designed for use with large and very large datasets. Features of this type of software include:

      • Protection of the source data

      • Fast analysis

      • Retention of audit trails

      • Built-in data analytical functions

      • A user-friendly interface

      • Import from various data sources and file formats

      • Analysis of 100 percent of transactions

      • Field statistics

      • Various sampling techniques

      • Benford’s Law analysis

      • Correlation and trend analysis

      • Drill-down features

      • Aging

      • Stratification

      • Fuzzy matching

      • Sophisticated duplicate testing

      • Auto-run or automated procedures
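Some of these functions are simple enough to sketch in a few lines. For instance, a rough first-digit Benford’s Law comparison, an illustration only and not the implementation used by any of the products named above, could look like this in Python:

```python
import math
from collections import Counter

def leading_digit(x):
    """Return the first significant digit of a positive number."""
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

def benford_deviations(amounts):
    """Compare observed leading-digit frequencies with Benford's Law.

    Returns {digit: (observed_share, expected_share)}. A large gap on
    a specific digit flags a population worth drilling into.
    """
    counts = Counter(leading_digit(abs(a)) for a in amounts if a)
    n = sum(counts.values())
    return {d: (counts.get(d, 0) / n, math.log10(1 + 1 / d))
            for d in range(1, 10)}
```

Under Benford’s Law, about 30.1 percent of naturally occurring amounts should begin with a 1; invented figures often depart sharply from that curve.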

      ActiveData[17] is an Excel add-in that has data analytical capabilities. It is a cross between Excel/Access and the more powerful data analytical software. It is feature-rich, with an attractively low price.

      

ANOMALIES VERSUS FRAUD WITHIN DATA

      Data anomalies are a fact of life. There will always be inconsistent, abnormal, or incorrect data residing in databases. This is quite normal. Database anomalies could be the result of missing or unmatched information caused by human error or by flaws and limitations in the database design. Anomalies can arise whenever a record is entered, modified, or deleted.

      Insertion anomalies occur when data is being entered into the database. One form of this anomaly occurs when information cannot be entered until additional information from another source is entered. A new employee’s shift scheduling cannot be entered until the employee has a payroll number, and the payroll number may not be assigned immediately, as the new employee’s first pay will not occur until two weeks after starting employment.
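The payroll-number dependency can be illustrated with a foreign-key constraint. This sketch uses Python’s built-in sqlite3 module; the table and column names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
con.execute("CREATE TABLE payroll (payroll_no INTEGER PRIMARY KEY)")
con.execute("""CREATE TABLE shift_schedule (
                   payroll_no INTEGER REFERENCES payroll(payroll_no),
                   shift_date TEXT)""")

# The new employee has no payroll number yet, so the shift entry
# is rejected: the insertion anomaly described above.
rejected = False
try:
    con.execute("INSERT INTO shift_schedule VALUES (101, '2024-01-15')")
except sqlite3.IntegrityError:
    rejected = True
print("insert rejected:", rejected)
```

Once a row for payroll number 101 exists in the payroll table, the same insert succeeds.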

      Data must be entered in a consistent format. The most common insertion errors are missing or incorrectly formatted entries. Well-designed software should have error-checking capabilities that display an error message and prevent the record from being saved if there is a blank entry where data is expected. Error checking or validation should also prevent an entry that does not fall within an acceptable range. For instance, the program would not accept a number outside of 01 to 12 where the field is a numeric month field. It may not accept a single digit for a month if the validation was designed to require a leading zero where the month would normally be a single digit. This helps to reduce errors where the operator meant to enter 12 but entered a 1 instead.
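Such range-and-format validation for the month field might be sketched as follows; this is an illustration, not any particular system’s implementation:

```python
import re

# Accept only a zero-padded two-digit month, 01 through 12.
MONTH_RE = re.compile(r"0[1-9]|1[0-2]")

def validate_month(entry: str) -> bool:
    """Return True only for a correctly formatted month field."""
    return MONTH_RE.fullmatch(entry) is not None

# Rejecting a bare "1" catches the operator who meant to type "12".
print(validate_month("12"), validate_month("1"), validate_month("13"))
```

The same pattern extends to any fixed-format field: validate both the range and the exact format at entry time, rather than cleaning the data afterward.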

      Deletion anomalies occur when the last record for a particular group of information is deleted. Removing that record may remove other relevant information associated with it: deleting facts about one entity automatically deletes unrelated facts stored in the same record.

      Let’s say an employee has left to work for another employer. The former employee’s shift-schedule information is deleted, but the associated address information might also be contained in that last record for the employee. Where would the employer send the employee’s last paycheck or accumulated vacation pay?
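The usual remedy is normalization: keeping the employee’s master data in its own table so that deleting the last schedule record does not delete the address. A toy sketch with hypothetical records:

```python
# Unnormalized design: the address lives only in the shift records.
shifts = [{"employee": "A. Smith", "date": "2024-01-15",
           "address": "12 Elm St"}]
shifts.clear()  # last shift record deleted; the only copy of the address is gone

# Normalized design: master data survives schedule deletions.
employees = {"A. Smith": {"address": "12 Elm St"}}
schedule = [{"employee": "A. Smith", "date": "2024-01-15"}]
schedule.clear()
print(employees["A. Smith"]["address"])  # still available for the last paycheck
```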

      Modification or update anomalies occur when a change to one piece of information requires corresponding changes to many other records, creating the possibility that some records will be updated incorrectly or missed entirely.

      When we analyze data for indications of fraud, we are not interested in insertion, deletion, or modification anomalies caused by the business systems (other than to note poor system designs that lead to internal control problems). What we are interested in are unexpected or strange items, such as outliers or too many inliers. We target suspicious transactions, or transactions that are too typical to be natural. We look at the unusual in relationship to the usual.
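Two of these targets, extreme outliers and clusters of “too typical” inliers, can be sketched with simple statistics. The threshold values and payment data below are hypothetical:

```python
import statistics

def flag_outliers(amounts, z=2.0):
    """Flag amounts more than z standard deviations from the mean."""
    mean = statistics.mean(amounts)
    sd = statistics.stdev(amounts)
    return [a for a in amounts if sd and abs(a - mean) / sd > z]

def flag_near_limit(amounts, limit=5000, band=0.05):
    """Flag amounts sitting just below an approval limit.

    A cluster in this band is 'too typical to be natural' and may
    indicate invoices split to stay under the approver's authority.
    """
    return [a for a in amounts if limit * (1 - band) <= a < limit]

payments = [120, 135, 4990, 4975, 4999, 128, 90000]
print(flag_outliers(payments))    # the extreme payment
print(flag_near_limit(payments))  # the suspicious cluster under 5,000
```

Both flags are starting points for follow-up, not conclusions; the next paragraph makes that point directly.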

      Anomalies in datasets will be common. Most will be errors, and very few, if any, will pertain to fraud. It is unlikely that any fraud can be proven solely by analyzing data. Analyzing data to identify anomalies or patterns gives the auditor or investigator a starting point for further analysis. One must follow the audit trail, reviewing the source documents and supporting evidence behind the records that were flagged.

      It is important to employ professional skepticism at this point by:

      • Critically assessing the anomalies without making a conclusion.

      • Having no biases caused by being overly suspicious or cynical.

      • Not accepting evidence or information gathered at face value.

      • Ensuring that all evidence or information is complete.

      • Pursuing the facts through the critical review of documents associated with the data anomaly.

      • Assessing whether information provided by staff lacks objectivity or reflects a lack of knowledge.

      What



[12] Microsoft Office, “Access Database Software and Applications,” accessed July 13, 2013, http://office.microsoft.com/en-ca/access.

[13] Microsoft Office, “Spreadsheet Software, Microsoft Excel,” accessed July 13, 2013, http://office.microsoft.com/en-ca/excel.

[14] “ACL Compliance, Audit, Governance & Risk Software – Data Analytics and Cloud-Based GRC Management,” accessed July 13, 2013, www.acl.com.

[15] “Arbutus Audit and Compliance Analytics, and Continuous Monitoring Solutions,” accessed July 13, 2013, www.arbutussoftware.com.

[16] “IDEA–CaseWare International,” accessed July 13, 2013, www.caseware.com/products/idea.

[17] “InformationActive.com – Data Analytics for Excel and SQL Databases,” accessed July 13, 2013, www.informationactive.com.