Analytics Scripting PDF
Using Analytics
Analytics provides you with a wide-ranging set of tools for working with data. Beginning with the
import of data, Analytics gives you numerous options for progressing through the data analysis cycle.
Analytics does not impose or require any particular data analysis workflow. The commands,
functions, and other tools in Analytics can be assembled into whatever workflow you arrive at for
analyzing a particular data set and achieving your analysis objectives.
That said, understanding the general data analysis cycle can help you structure your work in
Analytics.
Import the data – You must import data into Analytics before you can analyze it.
Prepare the data – Often you must perform one or more data preparation tasks before data is ready to analyze.
Analyze the data – You perform analysis in Analytics by using commands and other tools to gain general insights about the data you are investigating, and to answer specific questions.
Report the results – Once your data analysis is complete, Analytics gives you several different ways to report or present your results.
Tip
To get an understanding of how the data analysis cycle can work in Analytics, do the
introductory tutorial: "Getting started with Analytics (non-Unicode edition)" on
page 35
Getting started
This section of the Analytics Help provides a variety of introductory and overview information,
including:
"What is Analytics?" on the A high-level overview of Analytics features, and the end-to-end process of
facing page analyzing data using Analytics.
"What is ACL for Windows?" Information about the ACL for Windows installation package and the ACL for
on page 32 Windows main screen.
"Get help with Analytics" on Where to go for help as you are using Analytics.
page 103
"The Analytics user interface" An overview of the Analytics interface, including customizable elements.
on page 106
"Analytics projects" on Information about Analytics projects, which you use to contain and organize
page 163 your work in Analytics.
What is Analytics?
Analytics is a data analysis application that provides a powerful combination of features:
l Data access
Import a wide variety of different types of data from file-based data sources, databases, or
cloud data services.
l Data analysis
Use Analytics commands, functions, and other tools to gain general insights about the data you
are investigating, and to answer specific questions. You can perform data analysis ad hoc with
the user interface, or automate your analysis using ACLScript, the powerful scripting language
in Analytics.
l Reporting
Report your findings using native reporting features in Analytics, or import Analytics data into a
third-party reporting tool such as Tableau.
l Export capabilities
Export findings, or any other data, to popular file types such as Excel or delimited text. You can
also upload records to the Results app in the HighBond platform for processing and issue
remediation using workflow automation tools, and for data visualization.
Basic workflow
The diagram below shows the basic workflow associated with Analytics:
1. Import data to an Analytics table
2. Analyze the data and output a results table
3. Export the analysis results, or import them to a reporting tool
This step is optional. You can also use native reporting features in Analytics.
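As a rough illustration of steps 2 and 3, the ACLScript fragment below summarizes a table by a code field and exports the grouped results to an Excel file. It is only a sketch: the table name (Trans_All), the field names (CODES, AMOUNT, COUNT), and the output file name are placeholders borrowed from the getting started tutorial, not required names.

COMMENT Step 2 - analyze the data and output a results table
OPEN Trans_All
SUMMARIZE ON CODES SUBTOTAL AMOUNT TO "Trans_All_Grouped" OPEN PRESORT
COMMENT Step 3 - export the analysis results for use in a reporting tool
OPEN Trans_All_Grouped
EXPORT FIELDS CODES AMOUNT COUNT XLSX TO "Trans_All_Grouped"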
Note
Access to each component is determined by your Galvanize subscription type.
Components
Number – Component – Description
1 – Organization selector – Switch between accounts (organizations) you have access to using this drop-down list.
2 – Toolbar – Profile: Update your profile or sign out of ACL for Windows. Information: Access help and product documentation or contact Support.
3 – Recent Analytics Files – View recently accessed files in Analytics. Click Refresh List to update the list of files, or press F5.
5 – Create
o Analytic Project – Create a new Analytics project and open it in Analytics.
o Workflow – Open collections in Results for viewing, or build a new workflow for organizing, tracking, and remediating exceptions.
6 – Sample Files – Open pre-built Analytics projects that include a variety of sample data.
1. In ACL for Windows, select Sign Out and close from the profile drop-down list .
You are signed out of your current instance.
2. Double-click the ACL for Windows shortcut on the desktop.
The Launchpad sign-in screen opens.
3. Sign in using your HighBond account, by entering your user name (email) and password and
clicking Sign In.
4. Select the appropriate instance from the drop-down list and click Activate Analytics .
ACL for Windows opens. Any activities you perform involving HighBond now use the instance
you just selected.
Do the right version of the tutorial – Do this version of the tutorial if you're using the non-Unicode edition of Analytics. If you're using the Unicode edition, do "Getting started with Analytics (Unicode edition)" on page 69.
Tip
To find out which edition of Analytics you're using, on the Analytics main menu, click
Help > About to open the Analytics dialog box. The edition designation appears after
the version number.
Note
The Chinese and Japanese user interfaces are Unicode-only.
Scenario
After you've analyzed the data, you want to present the results of your analysis visually, to
better engage your audience.
Optional section
You're told that from now on, reviewing corporate credit card transactions will be a recurring
responsibility.
To allow yourself, or someone else, to perform future reviews quickly and accurately, you
decide to create a script to automate some of the work.
Planning guidelines
Develop clear, specific objectives
What is the intended end product of your analysis?
You need clearly defined objectives in order to be able to plan how to achieve them. For example, in
this tutorial, your specific objectives are:
l identify the count, and total amount, of corporate credit card transactions in each merchant
category
l identify any transactions in prohibited categories
Map a step-by-step approach
How will you achieve your objectives?
Accomplishing an objective often requires more than one step, so map a detailed, step-by-step
approach to guide you along the way.
For example, two of the steps in the planning for this tutorial could be:
l combine all the individual transaction files into a single file
l group the combined transaction data into merchant categories
Once you've broken down the larger objectives into individual steps, you can consider which
Analytics features and functions to use to perform each step.
Identify what data you'll need
What data do you need to achieve your objectives?
Itemize the required source data to the level of specific data elements or fields. You won't be able to
achieve your desired output without the appropriate input.
In this tutorial you have the main transaction files, but to achieve your second objective you'll also
need a list of prohibited merchant category codes.
Consider technical requirements
Are there any technical considerations you must take into account?
Regardless of which tool you're using for data analysis, you must work within its constraints. Is the
source data stored in a location or a system that the tool can access, and in a format that it can read?
Is the analysis you're proposing supported by the tool?
For example, in order to combine multiple tables in Analytics, the data types of the corresponding
fields in each table must be the same. Analytics supports changing the data type of a field, but that's a
step you need to account for in your planning.
Be prepared to iterate
You may need to adjust your plan as you go along.
In the course of your analysis, you may discover something unexpected that warrants further investigation, or realize that you need additional data and additional analytical steps.
Your plan can evolve as your understanding of the data evolves. And it can serve as the basis for a
more mature plan for future analysis of a similar nature.
Import data
You must import data into Analytics before you can analyze it.
We'll get familiar with the import process by using the Data Definition Wizard to import three Excel
worksheets. Importing from Excel is one of the most common methods for acquiring data for analysis
in Analytics. However, Analytics supports importing data from a wide variety of data sources.
Steps
1. Double-click the ACL for Windows shortcut on your desktop.
2. In the ACL for Windows screen, under Open, click Analytic Project.
3. Navigate to C:\Users\user_account_name\Documents\ACL Data\Sample Data Files
and double-click Sample Project.ACL.
Sample Project.ACL opens in Analytics.
If you did not install the Sample Data Files folder in the default location when you installed
Analytics, navigate to the location where you installed it.
l Trans2_May$
5. Make sure Use first row as Field Names is selected, click Next, and then click Finish.
The two Excel worksheets are imported into two separate Analytics tables.
Note
You're making adjustments to a data field in the Data Definition Wizard, during
the import process. You can also make adjustments later, after you have
completed importing the data. You'll see the reason for the adjustments in the
next section of the tutorial.
6. Click Next, in the File name field type Trans_Apr, and click Save.
7. Click Finish, and then click OK.
The third Excel worksheet is imported into an Analytics table.
You should now have three new Analytics tables in the Overview tab of the Navigator. These tables
contain read-only copies of the Excel data. They do not contain the Excel source data itself.
Prepare data
Often you must perform one or more data preparation tasks before data is ready to analyze.
For this tutorial, you'll perform two preparation tasks:
Key point
The time you spend importing and preparing data may exceed the time you spend on the
actual analysis. However, they are critical initial stages, and provide the foundation that your
analysis is built on.
CODES – change the Type to ASCII
AMOUNT – change the Type to PRINT, and enter 2 in the Dec. field to specify that numeric values display two decimal places.
When you're finished, the May table layouts should look like the layout below.
Note
The date format (YYYY-MM-DD) isn't shown in the layout summary. The
DESCRIPTION field length is different in the two May layouts.
Note
We're verifying the data after updating the data types. When you verify data in
Analytics, you're checking that all the values in a field conform to the requirements of
the field's data type. So it makes sense to verify data only once the data types are
finalized.
Steps
1. Open the Trans_Apr table.
2. From the Analytics main menu, select Data > Verify .
3. In the Verify dialog box, select all the fields in the field list.
Tip
Use Shift+click to select multiple adjacent fields.
4. Click OK.
The result should be: 0 data validity errors detected.
Learn more
Did you notice that Analytics automatically translated the action you performed in the
user interface into the ACLScript VERIFY command? Every command-level action you
perform in the user interface is automatically translated into its corresponding ACLScript
command, and captured and stored in the command log that accompanies each
Analytics project.
This automatic generation of valid, runnable script syntax is one of the most powerful
features in Analytics. We'll be looking at scripting in an optional section at the end of the
tutorial.
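For reference, the log entry for this verification looks like the command below (the field names are the ones in the tutorial's sample data). You can copy syntax like this from the log into a script and rerun it later.

VERIFY FIELDS CARDNUM AMOUNT DATE CODES CUSTNO DESCRIPTION ERRORLIMIT 10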
5. In the Navigator, double-click the Trans1_May table to open it, and repeat the steps to verify
the data.
6. Do the same for the Trans2_May table.
Neither table should contain any data validity errors.
Note
If you get an error message stating Maximum error limit reached, check that you correctly changed the format of the Date field in the table layout to YYYY-MM-DD.
Learn more
If you want to see what happens when Analytics does identify data validity errors, open
Tables\Badfile and run the verification process.
4. Select Use Output Table so that the output table with the combined data opens automatically
after you run the command.
5. In the To field, type Trans_All and click OK.
Note
Don't worry about the notification. The append command performs some
automatic harmonization of numeric fields, which saves you time and effort.
The new Trans_All table is created, and contains all the records from the three input tables.
The record count in the status bar at the bottom of the Analytics interface should say
Records: 481.
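For reference, the append operation you just performed is captured in the log as a single ACLScript command similar to the one below (the script in the optional section at the end of the tutorial uses the same command with table names suffixed _s).

APPEND Trans_Apr Trans1_May Trans2_May TO "Trans_All" OPEN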
You're now ready to move on to some actual data analysis.
Analyze data
You perform analysis in Analytics by using commands and other tools to gain general insights about
the data you are investigating, and to answer specific questions.
Note
The analysis stage is where the strength of your earlier planning becomes apparent.
If you've formulated clear objectives regarding your investigation, you'll have a clearer
idea of the types of analysis to perform.
3. In the Summarize dialog box, select the following fields and options:
Tab Field or option Select or type
4. Click OK.
The new Trans_All_Grouped table is created. The table contains 110 records, one for each
unique merchant category code in the Trans_All table. The COUNT field tells you how many
source records are in each group.
Tip
Right-click the table view and select Resize All Columns to make the view
more compact.
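If you later want to reproduce this grouping in a script, the log entry you just generated contains the full syntax. A simplified sketch of such a SUMMARIZE command, using the tutorial's field names, looks like this (your logged command may include additional parameters):

SUMMARIZE ON CODES SUBTOTAL AMOUNT OTHER DESCRIPTION TO "Trans_All_Grouped" OPEN PRESORT STATISTICS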
Where did employees spend the most money?
o Right-click the Total AMOUNT header and select Quick Sort Descending
The Description field shows you that the most money was spent on:
o Caterers
o Eating places and Restaurants
o Hilton International

What were the largest single expenditures?
o Right-click the Maximum AMOUNT header and select Quick Sort Descending
The Description and Maximum AMOUNT fields show you that the largest single expenditure was a Club Med amount of $1999.06.
Is Club Med an authorized merchant code for the corporate credit card? If the credit card limit is $2000, was an employee charging an amount just under the limit?

What does an examination of infrequently used codes reveal?
o Right-click the COUNT header and select Quick Sort Ascending
Five categories had only a single charge each. Are some of them prohibited categories? Perhaps one or more employees thought that misusing a company card only very occasionally would allow them to escape detection.
o Cigar Stores & Stands
o Dating & Escort Svcs.
o Babysitting services
o Amusement Parks
o Civic, Fraternal, and Social Associations

Are any of the categories prohibited?
o Right-click the DESCRIPTION header and select Quick Sort Ascending to alphabetize the field values for easier scanning
o Scan down the field looking for suspicious categories
Perhaps one or more of these categories are prohibited?
o Babysitting services
o Betting (including Lottery Tickets, Casino)
o Civic, Fraternal, and Social Associations
o Dating & Escort Svcs.
o Massage Parlors
o Precious Stones and Metals, Watches and Jewel
o Video Game Arcades/Establishments
Note
Manual scanning is impractical for all but small data sets. We'll look at a more practical, more reliable method next.
Learn more
Perhaps you just want to perform some quick analysis and you don't want to output the results
to a new table. When you summarized the Trans_All table, instead of selecting File in the
Summarize dialog box, you could select Screen, and output the results to the
Analytics display area.
Outputting to screen is only practical for smaller data sets. However, it has the advantage of providing an easy way to drill down on individual groups and see only the source records in each group.
A general review of the corporate credit card transactions alerted you to some possible prohibited
transactions. You decide to confirm whether any transactions are prohibited by matching a list of
prohibited merchant category codes against the data.
Steps
Note
Make sure you copy the entire string, including all quotation marks.
l Click Edit View Filter to open the Expression Builder, double-click the filter name in the
Filters list, and click OK.
Tip
The Filter history list holds a maximum of 10 filters, so at times you may need
to use the Expression Builder method for reapplying a saved filter.
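For context, a saved filter is just a logical expression. A filter that tests the merchant category code against a list of values can be written with the MATCH( ) function, as in the sketch below. The three codes shown are placeholders for illustration only; the tutorial's actual filter contains nine codes.

SET FILTER TO MATCH(CODES, "7273", "7995", "7297")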
Learn more
Beyond filters
Filters work well if the number of criteria or conditions contained by the filter is manageable.
The filter you created in this tutorial contains only 9 codes. But what if your list of prohibited
merchant category codes was several dozen, or more?
A more efficient approach would be to join an Analytics table containing the prohibited codes
with the transactions table. Every match in the joined output table would be a prohibited
transaction.
Joins are beyond the scope of this tutorial, but they are a frequently used feature in Analytics.
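To give a sense of what that looks like, here is a sketch of such a join in ACLScript. The Prohibited_Codes table and its CODE field are hypothetical names used only for illustration; the other field names are the ones used in this tutorial.

OPEN Trans_All
OPEN Prohibited_Codes SECONDARY
JOIN PKEY CODES FIELDS CODES DATE AMOUNT DESCRIPTION SKEY CODE TO "Prohibited_Trans" OPEN PRESORT SECSORT

Each record in the output table Prohibited_Trans would represent a transaction whose merchant category code appears in the prohibited list.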
Report results
Once your data analysis is complete, Analytics gives you several different ways to report or present
your results.
Traditional reports with columns of data are available, but we'll look at conveying results using the
more engaging data visualization described below.
Treemap visualization
This treemap visualization shows the grouped credit card transactions you output in the Trans_All_Grouped table. The relation between groups is conveyed in two different ways:
l size of the box – indicates the count of individual transactions in each group
The larger the box, the greater the number of transactions. The boxes are arranged in size
from top left to bottom right.
l color intensity of the box – indicates the total amount of each group
The darker the box, the greater the total amount.
So, for example, the size of the Club Med box, in the bottom right quadrant, indicates only a small
number of transactions, but the color indicates that the total transaction amount is significant.
Steps
1. Go to Launchpad (www.highbond.com).
2. Enter your HighBond account credentials (e-mail and password) and click Sign In.
Launchpad opens.
3. Click Results .
The Results homepage opens.
Note
If you cannot access Results, you may not be assigned an appropriate
subscription type or Results role. Use one of the alternative report creation
methods listed in "Other reporting methods in Analytics" on page 57.
If you would like to access Results, contact your company’s Analytics account
administrator.
Create a Collection
Steps
1. From the Results homepage, click New Collection.
2. On the New Collection page, in the Name field, enter or copy ACL Tutorial.
3. At the bottom of the page, click Create Collection.
The Collection settings page opens.
Create an Analysis
Steps
1. At the bottom of the Collection settings page, under What's Next?, click create your first Data
Analysis .
The Analysis Details page opens.
2. On the Analysis Details page, in the Name field, enter or copy Sample Report.
3. Click Create Analysis .
The new ACL Tutorial Collection opens with the empty Sample Report Analysis that you just
created.
Note
Leave Results open. You will be coming back to create the data visualization.
4. Click To, and in the Select Destination Test dialog box navigate to the Sample Report
Analysis container you just created and double-click to open it.
5. In the New data analytic field enter or copy Trans_All_Grouped and click Create.
You are returned to the Export dialog box and an ID number and data center code are prefilled
in the To text box.
6. Click OK.
The data in the Trans_All_Grouped table is exported to Results.
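If you script this export later, the log captures it as an EXPORT command pointing at the same destination. A rough sketch using the ACLGRC export type is shown below; the destination value "12345@us" is a placeholder for the ID number and data center code that were prefilled in the To text box, the field names are approximate, and depending on your environment the command may also require a PASSWORD parameter.

OPEN Trans_All_Grouped
EXPORT FIELDS CODES DESCRIPTION COUNT AMOUNT ACLGRC TO "12345@us"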
4. In the Configure Visualization panel, select the fields and options shown below.
Note
5. Click Apply .
The Treemap visualization is generated.
You can hover your mouse over the individual boxes in the treemap to see the embedded data.
If you change the size of the browser window, the treemap dynamically updates by
repositioning boxes, and by displaying and suppressing a different selection of associated
descriptions.
Steps
1. Click Untitled at the top left corner of the Treemap visualization, type a title for the visualization such as Transaction Treemap, and press Enter.
2. Click Save > Save As .
3. In the Title field, type a name for the interpretation such as Tutorial visualizations and click
Save.
The interpretation and the visualization are both saved and can be reopened later.
4. Click the name of the collection, ACL Tutorial, in the title bar to return to the Sample Report Analysis container.
5. Click View Interpretations. The Interpretations dialog box appears; notice that it lists the newly created interpretation, Tutorial visualizations.
You can create multiple visualizations and interpretations in each Analysis container. Each
visualization is based on the data in the Table View.
Data visualizations in the Analysis App window – The data visualization capability in Results is also available locally in the Analysis App window, a freestanding component of Analytics.
Note
Some of the charts and visualizations available in Results may not be available in the Analysis App window until a new version of Analytics is released.
Legacy Analytics charts – Analytics contains a legacy charting and graphing capability that allows you to create basic visual reports.
For more information, see "Working with Analytics graphs" on page 1330.
Traditional columnar reports – In some cases, a traditional text- and number-based report with rows and columns of data is all you need.
For more information, see "Formatting and generating Analytics reports" on page 1324.
Third-party reporting tool – You can use a third-party reporting tool such as Tableau or Microsoft BI and import data directly from Analytics.
For more information, see "Connecting to Analytics from a third-party reporting application" on page 1342.
Exporting data to Excel or CSV – You can export data to Excel, or to a comma-separated file, and use the reporting capabilities of Excel, or of any tool that can work with a CSV file.
For more information, see "Exporting data" on page 203.
You're finished
Congratulations! You've completed your end-to-end introduction to analyzing data using Analytics.
Where to next?
You have several options for continuing to learn about Analytics:
Academy – Academy offers a range of courses for various experience levels. ACL Analytics Foundations Program is a series of six mini-courses that teaches Analytics basics for new users.
Academy is the Galvanize online training resource center. Go to the course catalog to see the available courses. Academy courses are included at no extra cost for any user with a subscription.
Analytics and ACLScript Help – You're currently in the Analytics and ACLScript Help. The Help provides reference-style conceptual material, step-by-step instructions, and ACLScript syntax for all aspects of Analytics.
For example, here are the Help topics for the append operation, which formed part of the tutorial you just completed:
o "Appending tables" on page 859 (conceptual)
o "Append tables" on page 869 (step-by-step instructions)
o "APPEND command" on page 1676 (ACLScript syntax)
You can gain a lot of value using Analytics in an ad hoc or manual fashion without ever writing a script.
For the most part, anything that can be done in a script can be done in the user interface, and vice
versa. However, to gain the most value, power, and efficiency from Analytics, you need to script.
The good news is that Analytics provides tools to make scripting relatively easy, even for a novice.
Save time
The basic review process is standardized. With each review cycle, you can spend time
repeating the basic process manually, or you can save time by automating the process.
What is a script?
An Analytics script is a series of ACLScript commands that perform a particular task, or several
related tasks. For example, everything that you just did manually in the first part of this tutorial could
also be performed using a script.
ACLScript is the command language that forms the basis of Analytics. Scripts are stored in
Analytics projects. Individual scripts appear in the Navigator, and are prefaced by the script icon .
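For example, a miniature script along the following lines opens a table, verifies it, and applies a filter. This is only a sketch using the table and field names from this tutorial, not a script shipped with Analytics.

COMMENT Verify the combined transactions table and keep only large amounts.
OPEN Trans_All
VERIFY FIELDS CODES AMOUNT DATE ERRORLIMIT 10
SET FILTER TO AMOUNT >= 1000.00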
2. In the log, locate and click the SUMMARIZE command that outputs results to a new table.
The command prefills the Command Line near the top of the Analytics interface, just below the
toolbar.
Note
If the Command Line isn't visible, select Window > Command Line from the
Analytics main menu.
Note
We're going to skip over some scripting best practices in order to keep this
introduction to scripting brief. The goal is to demonstrate how easy it is for even new
users to create scripts in Analytics.
Steps
1. In the log, locate and select the following commands:
You're adding _2 to avoid name conflicts with the table and filter you already created manually.
Note
You can also run a script by right-clicking it in the Navigator and selecting Run.
A script does not have to be open to be run.
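Scripts can also be called from another script, or from the command line, with the DO SCRIPT command. For example, assuming the script name you create later in this section:

DO SCRIPT Getting_Started_tutorial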
Note
The script assumes that the Sample Data Files folder is installed in the default
location. If the folder is installed in a different location, you need to modify the
navigation paths in the script to point to the correct location.
The table names created by the script have _s appended so that the script doesn't overwrite the tables you created manually.
Steps
Note
It's important that you select the entire script and don't miss any lines.
Alternatively, you can download a text file with the script here: Getting started tutorial (non-Unicode edition)
3. Click in the Script Editor window and press Ctrl+V to paste the script syntax into the empty
Getting_Started_tutorial script.
2. Click Save the Open Project, and click Yes in the prompt that appears.
If you do not find the save icon, select Window > Toolbar in the Analytics main menu to enable
the toolbar.
Note
If you haven't worked with scripts before, the script syntax may look overwhelming at
first. Keep in mind that almost all the syntax was simply copied from the Analytics log.
The syntax for the interactive notifications in the script (DIALOG commands) was
auto-generated by another relatively simple Analytics tool.
The green COMMENT commands walk you through the script at a high level. You'll
recognize the tasks that you just completed in the preceding tutorial.
COMMENT
*** Non-Unicode Edition ***
This script performs all the actions that you performed manually in the "Getting Started with ACL Analytics" tutorial.
END
COMMENT Adjusts the table layouts of the three new Analytics tables.
OPEN Trans_Apr_s
DELETE FIELD CARDNUM OK
DEFINE FIELD CARDNUM ASCII 1 16 WIDTH 19
DELETE FIELD CODES OK
DEFINE FIELD CODES ASCII 33 4 WIDTH 7
OPEN Trans1_May_s
DELETE FIELD CODES OK
OPEN Trans2_May_s
DELETE FIELD CODES OK
DEFINE FIELD CODES ASCII 20 4 WIDTH 7
DELETE FIELD AMOUNT OK
DEFINE FIELD AMOUNT PRINT 204 9 2 WIDTH 9
DELETE FIELD DATE OK
DEFINE FIELD DATE DATETIME 24 10 PICTURE "YYYY-MM-DD" WIDTH 27
OPEN Trans_Apr_s
VERIFY FIELDS CARDNUM AMOUNT DATE CODES CUSTNO DESCRIPTION ERRORLIMIT 10
IF WRITE1=0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans_Apr_s table: 0 data validity errors detected" AT 12 28 )
IF WRITE1>0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans_Apr_s table: %WRITE1% data validity errors detected" AT 12 28 )
OPEN Trans1_May_s
VERIFY FIELDS CARDNUM CODES DATE CUSTNO DESCRIPTION AMOUNT ERRORLIMIT 10
IF WRITE1=0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans1_May_s table: 0 data validity errors detected" AT 12 28 )
IF WRITE1>0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans1_May_s table: %WRITE1% data validity errors detected" AT 12 28 )
OPEN Trans2_May_s
VERIFY FIELDS CARDNUM CODES DATE CUSTNO DESCRIPTION AMOUNT ERRORLIMIT 10
IF WRITE1=0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans2_May_s table: 0 data validity errors detected" AT 12 28 )
IF WRITE1>0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans2_May_s table: %WRITE1% data validity errors detected" AT 12 28 )
COMMENT Appends the three new Analytics tables into a single combined table.
APPEND Trans_Apr_s Trans1_May_s Trans2_May_s TO "Trans_All_s" OPEN
DIALOG (DIALOG TITLE "User Dialog" WIDTH 630 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 500 12 DEFAULT 1 ) (TEXT TITLE "The combined transactions table (Trans_All_s) contains %WRITE1% records" AT 12 28 )
You're finished
That's the end of this brief introduction to scripting. We hope you've seen enough to be convinced of
the value of scripting and that you want to learn more.
Where to next?
You have several options for learning more about scripting in Analytics:
Do the right version of the tutorial – Do this version of the tutorial if you're using the Unicode edition of Analytics. If you're using the non-Unicode edition, do "Getting started with Analytics (non-Unicode edition)" on page 35.
Tip
To find out which edition of Analytics you're using, on the Analytics main menu, click
Help > About to open the Analytics dialog box. The edition designation appears after
the version number.
Note
The Chinese and Japanese user interfaces are Unicode-only.
Scenario
After you've analyzed the data, you want to present the results of your analysis visually, to
better engage your audience.
Optional section
You're told that from now on, reviewing corporate credit card transactions will be a recurring
responsibility.
To allow yourself, or someone else, to perform future reviews quickly and accurately, you
decide to create a script to automate some of the work.
Planning guidelines
Develop clear, specific objectives
What is the intended end product of your analysis?
You need clearly defined objectives in order to be able to plan how to achieve them. For example, in
this tutorial, your specific objectives are:
l identify the count, and total amount, of corporate credit card transactions in each merchant
category
l identify any transactions in prohibited categories
Map a step-by-step approach
How will you achieve your objectives?
Accomplishing an objective often requires more than one step, so map a detailed, step-by-step
approach to guide you along the way.
For example, two of the steps in the planning for this tutorial could be:
l combine all the individual transaction files into a single file
l group the combined transaction data into merchant categories
Once you've broken down the larger objectives into individual steps, you can consider which
Analytics features and functions to use to perform each step.
Identify what data you'll need
What data do you need to achieve your objectives?
Itemize the required source data to the level of specific data elements or fields. You won't be able to
achieve your desired output without the appropriate input.
In this tutorial you have the main transaction files, but to achieve your second objective you'll also
need a list of prohibited merchant category codes.
Consider technical requirements
Are there any technical considerations you must take into account?
Regardless of which tool you're using for data analysis, you must work within its constraints. Is the
source data stored in a location or a system that the tool can access, and in a format that it can read?
Is the analysis you're proposing supported by the tool?
For example, in order to combine multiple tables in Analytics, the data types of the corresponding
fields in each table must be the same. Analytics supports changing the data type of a field, but that's a
step you need to account for in your planning.
Be prepared to iterate
You may need to adjust your plan as you go along.
In the course of your analysis, you may discover something unexpected that warrants further investigation, or realize that you need additional data and additional analytical steps.
Your plan can evolve as your understanding of the data evolves. And it can serve as the basis for a
more mature plan for future analysis of a similar nature.
Import data
You must import data into Analytics before you can analyze it.
We'll get familiar with the import process by using the Data Definition Wizard to import three Excel
worksheets. Importing from Excel is one of the most common methods for acquiring data for analysis
in Analytics. However, Analytics supports importing data from a wide variety of data sources.
Steps
1. Double-click the ACL for Windows shortcut on your desktop.
2. In the ACL for Windows screen, under Open, click Analytic Project.
3. Navigate to C:\Users\user_account_name\Documents\ACL Data\Sample Data Files
and double-click Sample Project.ACL.
Sample Project.ACL opens in Analytics.
If you did not install the Sample Data Files folder in the default location when you installed
Analytics, navigate to the location where you installed it.
l Trans2_May$
5. Make sure Use first row as Field Names is selected, click Next, and then click Finish.
The two Excel worksheets are imported into two separate Analytics tables.
Note
You're making adjustments to a data field in the Data Definition Wizard, during
the import process. You can also make adjustments later, after you have
completed importing the data. You'll see the reason for the adjustments in the
next section of the tutorial.
6. Click Next, in the File name field type Trans_Apr, and click Save.
7. Click Finish, and then click OK.
The third Excel worksheet is imported into an Analytics table.
You should now have three new Analytics tables in the Overview tab of the Navigator. These tables
contain read-only copies of the Excel data. They do not contain the Excel source data itself.
Prepare data
Often you must perform one or more data preparation tasks before data is ready to analyze.
For this tutorial, you'll perform two preparation tasks:
Key point
The time you spend importing and preparing data may exceed the time you spend on the
actual analysis. However, they are critical initial stages, and provide the foundation that your
analysis is built on.
CODES – change the Type to UNICODE
AMOUNT – change the Type to PRINT, and enter 2 in the Dec. field to specify that numeric values display two decimal places.
When you're finished, the May table layouts should look like the layout below.
Note
The date format (YYYY-MM-DD) isn't shown in the layout summary. The
DESCRIPTION field length is different in the two May layouts.
Note
We're verifying the data after updating the data types. When you verify data in
Analytics, you're checking that all the values in a field conform to the requirements of
the field's data type. So it makes sense to verify data only once the data types are
finalized.
Steps
1. Open the Trans_Apr table.
2. From the Analytics main menu, select Data > Verify .
3. In the Verify dialog box, select all the fields in the field list.
Tip
Use Shift+click to select multiple adjacent fields.
4. Click OK.
The result should be: 0 data validity errors detected.
Learn more
Did you notice that Analytics automatically translated the action you performed in the
user interface into the ACLScript VERIFY command? Every command-level action you
perform in the user interface is automatically translated into its corresponding ACLScript
command, and captured and stored in the command log that accompanies each
Analytics project.
This automatic generation of valid, runnable script syntax is one of the most powerful
features in Analytics. We'll be looking at scripting in an optional section at the end of the
tutorial.
5. In the Navigator, double-click the Trans1_May table to open it, and repeat the steps to verify
the data.
6. Do the same for the Trans2_May table.
Neither table should contain any data validity errors.
Note
If you get an error message stating Maximum error limit reached, check that you correctly changed the format of the Date field in the table layout to YYYY-MM-DD.
Learn more
If you want to see what happens when Analytics does identify data validity errors, open
Tables\Badfile and run the verification process.
4. Select Use Output Table so that the output table with the combined data opens automatically
after you run the command.
5. In the To field, type Trans_All and click OK.
Note
Don't worry about the notification. The append command performs some
automatic harmonization of numeric fields, which saves you time and effort.
The new Trans_All table is created, and contains all the records from the three input tables.
The record count in the status bar at the bottom of the Analytics interface should say
Records: 481.
You're now ready to move on to some actual data analysis.
Analyze data
You perform analysis in Analytics by using commands and other tools to gain general insights about
the data you are investigating, and to answer specific questions.
Note
The analysis stage is where the strength of your earlier planning becomes apparent.
If you've formulated clear objectives regarding your investigation, you'll have a clearer
idea of the types of analysis to perform.
3. In the Summarize dialog box, select the following fields and options:
Tab Field or option Select or type
4. Click OK.
The new Trans_All_Grouped table is created. The table contains 110 records, one for each
unique merchant category code in the Trans_All table. The COUNT field tells you how many
source records are in each group.
Tip
Right-click the table view and select Resize All Columns to make the view
more compact.
Where did employees spend the most money?
o Right-click the Total AMOUNT header and select Quick Sort Descending
The Description field shows you that the most money was spent on:
o Caterers
o Eating places and Restaurants
o Hilton International

What were the largest single expenditures?
o Right-click the Maximum AMOUNT header and select Quick Sort Descending
The Description and Maximum AMOUNT fields show you that the largest single expenditure was a Club Med amount of $1999.06.
Is Club Med an authorized merchant code for the corporate credit card? If the credit card limit is $2000, was an employee charging an amount just under the limit?

What does an examination of infrequently used codes reveal?
o Right-click the COUNT header and select Quick Sort Ascending
Five categories had only a single charge each. Are some of them prohibited categories? Perhaps one or more employees thought that misusing a company card only very occasionally would allow them to escape detection.
o Cigar Stores & Stands
o Dating & Escort Svcs.
o Babysitting services
o Amusement Parks
o Civic, Fraternal, and Social Associations

Are any of the categories prohibited?
o Right-click the DESCRIPTION header and select Quick Sort Ascending to alphabetize the field values for easier scanning
o Scan down the field looking for suspicious categories
Perhaps one or more of these categories are prohibited?
o Babysitting services
o Betting (including Lottery Tickets, Casino)
o Civic, Fraternal, and Social Associations
o Dating & Escort Svcs.
o Massage Parlors
o Precious Stones and Metals, Watches and Jewel
o Video Game Arcades/Establishments
Note
Manual scanning is impractical for all but small data sets. We'll look at a more practical, more reliable method next.
Learn more
Perhaps you just want to perform some quick analysis and you don't want to output the results
to a new table. When you summarized the Trans_All table, instead of selecting File in the
Summarize dialog box, you could select Screen, and output the results to the
Analytics display area.
Outputting to screen is only practical for smaller data sets. However, it has the advantage of providing an easy way to drill down on individual groups and see only the source records in each group.
A general review of the corporate credit card transactions alerted you to some possible prohibited
transactions. You decide to confirm whether any transactions are prohibited by matching a list of
prohibited merchant category codes against the data.
Steps
Note
Make sure you copy the entire string, including all quotation marks.
l Click Edit View Filter to open the Expression Builder, double-click the filter name in the
Filters list, and click OK.
Tip
The Filter history list holds a maximum of 10 filters, so at times you may need
to use the Expression Builder method for reapplying a saved filter.
Learn more
Beyond filters
Filters work well if the number of criteria or conditions contained by the filter is manageable.
The filter you created in this tutorial contains only 9 codes. But what if your list of prohibited
merchant category codes was several dozen, or more?
A more efficient approach would be to join an Analytics table containing the prohibited codes
with the transactions table. Every match in the joined output table would be a prohibited
transaction.
Joins are beyond the scope of this tutorial, but they are a frequently used feature in Analytics.
Report results
Once your data analysis is complete, Analytics gives you several different ways to report or present
your results.
Traditional reports with columns of data are available, but we'll look at conveying results using the
more engaging data visualization described below.
Treemap visualization
This treemap visualization shows the grouped credit card transactions you output in the Trans_All_Grouped table. The relation between groups is conveyed in two different ways:
l size of the box – indicates the count of individual transactions in each group
The larger the box, the greater the number of transactions. The boxes are arranged in size
from top left to bottom right.
l color intensity of the box – indicates the total amount of each group
The darker the box, the greater the total amount.
So, for example, the size of the Club Med box, in the bottom right quadrant, indicates only a small
number of transactions, but the color indicates that the total transaction amount is significant.
Steps
1. Go to Launchpad (www.highbond.com).
2. Enter your HighBond account credentials (e-mail and password) and click Sign In.
Launchpad opens.
3. Click Results .
The Results homepage opens.
Note
If you cannot access Results, you may not be assigned an appropriate
subscription type or Results role. Use one of the alternative report creation
methods listed in "Other reporting methods in Analytics" on page 91.
If you would like to access Results, contact your company’s Analytics account
administrator.
Create a Collection
Steps
1. From the Results homepage, click New Collection.
2. On the New Collection page, in the Name field, enter or copy ACL Tutorial.
3. At the bottom of the page, click Create Collection.
The Collection settings page opens.
Create an Analysis
Steps
1. At the bottom of the Collection settings page, under What's Next?, click create your first Data
Analysis .
The Analysis Details page opens.
2. On the Analysis Details page, in the Name field, enter or copy Sample Report.
3. Click Create Analysis .
The new ACL Tutorial Collection opens with the empty Sample Report Analysis that you just
created.
Note
Leave Results open. You will be coming back to create the data visualization.
4. Click To, and in the Select Destination Test dialog box navigate to the Sample Report
Analysis container you just created and double-click to open it.
5. In the New data analytic field enter or copy Trans_All_Grouped and click Create.
You are returned to the Export dialog box and an ID number and data center code are prefilled
in the To text box.
6. Click OK.
The data in the Trans_All_Grouped table is exported to Results.
4. In the Configure Visualization panel, select the fields and options shown below.
Note
5. Click Apply .
The Treemap visualization is generated.
You can hover your mouse over the individual boxes in the treemap to see the embedded data.
If you change the size of the browser window, the treemap dynamically updates by
repositioning boxes, and by displaying and suppressing a different selection of associated
descriptions.
Steps
1. Click Untitled at the top left corner of the Treemap visualization, type a title for the visualization such as Transaction Treemap, and press Enter.
2. Click Save > Save As .
3. In the Title field, type a name for the interpretation such as Tutorial visualizations and click
Save.
The interpretation and the visualization are both saved and can be reopened later.
4. Click the name of the collection, ACL Tutorial, in the title bar to return to the Sample Report Analysis container.
5. Click View Interpretations. The Interpretations dialog box appears; notice that it lists the newly created interpretation, Tutorial visualizations.
You can create multiple visualizations and interpretations in each Analysis container. Each
visualization is based on the data in the Table View.
Data visualizations in the Analysis App window – The data visualization capability in Results is also available locally in the Analysis App window, a freestanding component of Analytics.
Note
Some of the charts and visualizations available in Results may not be available in the Analysis App window until a new version of Analytics is released.
Legacy Analytics charts – Analytics contains a legacy charting and graphing capability that allows you to create basic visual reports.
For more information, see "Working with Analytics graphs" on page 1330.
Traditional columnar reports – In some cases, a traditional text- and number-based report with rows and columns of data is all you need.
For more information, see "Formatting and generating Analytics reports" on page 1324.
Third-party reporting tool – You can use a third-party reporting tool such as Tableau or Microsoft BI and import data directly from Analytics.
For more information, see "Connecting to Analytics from a third-party reporting application" on page 1342.
Exporting data to Excel or CSV – You can export data to Excel, or to a comma-separated file, and use the reporting capabilities of Excel, or of any tool that can work with a CSV file.
For more information, see "Exporting data" on page 203.
You're finished
Congratulations! You've completed your end-to-end introduction to analyzing data using Analytics.
Where to next?
You have several options for continuing to learn about Analytics:
Academy – Academy offers a range of courses for various experience levels. ACL Analytics Foundations Program is a series of six mini-courses that teaches Analytics basics for new users.
Academy is the Galvanize online training resource center. Go to the course catalog to see the available courses. Academy courses are included at no extra cost for any user with a subscription.
Analytics and ACLScript Help – You're currently in the Analytics and ACLScript Help. The Help provides reference-style conceptual material, step-by-step instructions, and ACLScript syntax for all aspects of Analytics.
For example, here are the Help topics for the append operation, which formed part of the tutorial you just completed:
o "Appending tables" on page 859 (conceptual)
o "Append tables" on page 869 (step-by-step instructions)
o "APPEND command" on page 1676 (ACLScript syntax)
You can gain a lot of value using Analytics in an ad hoc or manual fashion without ever writing a script.
For the most part, anything that can be done in a script can be done in the user interface, and vice
versa. However, to gain the most value, power, and efficiency from Analytics, you need to script.
The good news is that Analytics provides tools to make scripting relatively easy, even for a novice.
Save time
The basic review process is standardized. With each review cycle, you can spend time
repeating the basic process manually, or you can save time by automating the process.
What is a script?
An Analytics script is a series of ACLScript commands that perform a particular task, or several
related tasks. For example, everything that you just did manually in the first part of this tutorial could
also be performed using a script.
ACLScript is the command language that forms the basis of Analytics. Scripts are stored in
Analytics projects. Individual scripts appear in the Navigator, and are prefaced by the script icon .
2. In the log, locate and click the SUMMARIZE command that outputs results to a new table.
The command prefills the Command Line near the top of the Analytics interface, just below the
toolbar.
Note
If the Command Line isn't visible, select Window > Command Line from the
Analytics main menu.
Note
We're going to skip over some scripting best practices in order to keep this
introduction to scripting brief. The goal is to demonstrate how easy it is for even new
users to create scripts in Analytics.
Steps
1. In the log, locate and select the following commands:
You're adding _2 to avoid name conflicts with the table and filter you already created manually.
Note
You can also run a script by right-clicking it in the Navigator and selecting Run.
A script does not have to be open to be run.
Note
The script assumes that the Sample Data Files folder is installed in the default
location. If the folder is installed in a different location, you need to modify the
navigation paths in the script to point to the correct location.
The table names created by the script have _s appended so that the script doesn't overwrite the tables you created manually.
Steps
Note
It's important that you select the entire script and don't miss any lines.
Alternatively, you can download a text file with the script here: Getting started tutorial (Unicode edition)
3. Click in the Script Editor window and press Ctrl+V to paste the script syntax into the empty
Getting_Started_tutorial script.
2. Click Save the Open Project, and click Yes in the prompt that appears.
If you do not find the save icon, select Window > Toolbar in the Analytics main menu to enable
the toolbar.
Note
If you haven't worked with scripts before, the script syntax may look overwhelming at
first. Keep in mind that almost all the syntax was simply copied from the Analytics log.
The syntax for the interactive notifications in the script (DIALOG commands) was
auto-generated by another relatively simple Analytics tool.
The green COMMENT commands walk you through the script at a high level. You'll
recognize the tasks that you just completed in the preceding tutorial.
COMMENT
*** Unicode Edition ***
This script performs all the actions that you performed manually in the "Getting Started with ACL Analytics" tutorial.
END
COMMENT Adjusts the table layouts of the three new Analytics tables.
OPEN Trans_Apr_s
DELETE FIELD CARDNUM OK
DEFINE FIELD CARDNUM UNICODE 1 32 WIDTH 35
DELETE FIELD CODES OK
DEFINE FIELD CODES UNICODE 65 8 WIDTH 11
OPEN Trans1_May_s
DELETE FIELD CODES OK
OPEN Trans2_May_s
DELETE FIELD CODES OK
DEFINE FIELD CODES UNICODE 39 8 WIDTH 11
DELETE FIELD AMOUNT OK
DEFINE FIELD AMOUNT PRINT 407 18 2 WIDTH 9
DELETE FIELD DATE OK
DEFINE FIELD DATE DATETIME 47 20 PICTURE "YYYY-MM-DD" WIDTH 27
OPEN Trans_Apr_s
VERIFY FIELDS CARDNUM AMOUNT DATE CODES CUSTNO DESCRIPTION ERRORLIMIT 10
IF WRITE1=0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans_Apr_s table: 0 data validity errors detected" AT 12 28 )
IF WRITE1>0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans_Apr_s table: %WRITE1% data validity errors detected" AT 12 28 )
OPEN Trans1_May_s
VERIFY FIELDS CARDNUM CODES DATE CUSTNO DESCRIPTION AMOUNT ERRORLIMIT 10
IF WRITE1=0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans1_May_s table: 0 data validity errors detected" AT 12 28 )
IF WRITE1>0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans1_May_s table: %WRITE1% data validity errors detected" AT 12 28 )
OPEN Trans2_May_s
VERIFY FIELDS CARDNUM CODES DATE CUSTNO DESCRIPTION AMOUNT ERRORLIMIT 10
IF WRITE1=0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans2_May_s table: 0 data validity errors detected" AT 12 28 )
IF WRITE1>0 DIALOG (DIALOG TITLE "User Dialog" WIDTH 490 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 360 12 DEFAULT 1 ) (TEXT TITLE "Trans2_May_s table: %WRITE1% data validity errors detected" AT 12 28 )
COMMENT Appends the three new Analytics tables into a single combined table.
APPEND Trans_Apr_s Trans1_May_s Trans2_May_s TO "Trans_All_s" OPEN
DIALOG (DIALOG TITLE "User Dialog" WIDTH 630 HEIGHT 100 ) (BUTTONSET TITLE "&OK;&Cancel" AT 500 12 DEFAULT 1 ) (TEXT TITLE "The combined transactions table (Trans_All_s) contains %WRITE1% records" AT 12 28 )
You're finished
That's the end of this brief introduction to scripting. We hope you've seen enough to be convinced of
the value of scripting and that you want to learn more.
Where to next?
You have several options for learning more about scripting in Analytics:
Context-sensitive Help
Press F1 from any location in Analytics, or click the Help button from most locations, to open
a Help topic that explains the currently active window, dialog box, tab, or wizard screen.
From this initial Help topic you can often click links to access additional, more detailed information in
the online Help.
Community
Go to Community, a web-based platform with a variety of customer resources, including a customer
forum where you can post questions about Analytics features and functionality.
Support
(Account sign-in required)
From the Analytics main menu, select Help > Contact Galvanize to open a web browser and
connect to Support.
Open Analytics
To open Analytics, double-click the ACL for Windows desktop shortcut, then click an option in ACL for
Windows:
l New Analytic Project – create a new, empty Analytics project
l Open Analytic Project – open an existing Analytics project
l Open Analysis App – open an existing analysis app
l Under Recent Analytics Files, or Sample Files – open a recently opened or a sample Analytics project (.acl) or analysis app (.aclx)
If you create or open an Analytics project it opens in Analytics. If you open an analysis app, it opens in
the Analysis App window, a free-standing component of Analytics.
Close Analytics
To close Analytics, select File > Exit.
If any unsaved changes are detected in your project, you are prompted to save them before you exit.
Click Yes in the confirmation dialog box to save your changes and exit.
l Dialog Builder
Understanding the organization and function of the various user interface elements helps you work
effectively with Analytics.
1 – Title Bar – Displays the name of the open Analytics project, and the ACL for Windows component name.
2 – Main Menu – Provides access to most Analytics features, including menu commands for:
o Working with Analytics projects
o Performing data analysis
o Configuring options and connection settings
3 – Toolbar – Buttons in the toolbar are shortcuts to common actions. Analytics enables buttons that are relevant to your current activity. To display or to hide the toolbar, select Window > Toolbar.
Note
You can customize the buttons contained in the toolbar. For more information, see "Customize the Analytics toolbar" on page 152.
4 – Navigator – Displays information in three tabs about the open Analytics project:
o Overview tab – displays all items that belong to the project. You can right-click any project item to perform an action. To organize items in the Overview, right-click the project icon and select New > Folder. You can drag any project item into the folders that you create in the Overview.
o Log tab – displays the Analytics command log. All actions you take associated with the project are recorded and organized chronologically in the log. Double-click log entries to open them, and right-click log entries to perform an action.
o Variables tab – displays the names, values, and data categories of any variables in the project. The contents of the tab are dynamically updated as variables are created, deleted, or changed in value. Variable names are listed alphabetically.
Tip
To resize the Navigator, drag the divider between the Navigator and the display area. You can also double-click the divider to close or open the Navigator.
6 – Filter and Quick Search – A text box and drop-down list that allow you to perform two different tasks:
o Apply a filter to the data in the View tab
o Enter one or more search terms to perform a quick search of the data in the View tab
7 – Index – Allows you to apply existing indexes to the table and to see if an index is currently applied.
Tip
To resize the display area, drag the divider between the display
area and the Navigator. You can also double-click the divider to
close or open the Navigator.
Note
The options in the dialog boxes vary somewhat depending on the operation you
select. There are a number of standard options, explained below, that appear for
most operations.
Options that are not standard are explained elsewhere in the Analytics and ACLScript
Help.
field list or lists – Specify the input field or fields for the operation
Subtotal Fields – Specify one or more numeric fields to optionally subtotal as part of the operation
Presort – Specify that the input field is automatically sorted before the operation is performed
If – Specify an IF statement that excludes records that do not meet the specified condition
from the operation
To – Specify the name and location of the Analytics table that will contain the output results
Note
Appears on the Output tab (as Name) in command dialog boxes that have
an Output tab (not shown)
Use Output Table – Specify that an Analytics table containing output results opens automatically upon
completion of the operation
Appears on either the Main tab or the More tab.
Append To Existing File – Specify that output results contained in an Analytics table or text file are added to the
bottom of an existing Analytics table or text file
File Type – Specify an Analytics table or a text file when you save output results to a file
Depending on the operation, you may be able to save to either a table or a text file, or to
only one of these options
Name – Specify the name and location of the Analytics table or text file that will contain the output
results
Appears on the Main tab (as To) in command dialog boxes that do not have an Output tab
Local – Specify whether to save an Analytics table with output results locally or to the server (only
enabled when connected to a server table)
Appears on either the Main tab or the Output tab
Customizing Analytics
Analytics is installed with a standard set of configuration settings that define the default behavior of
the application. You can change any of these settings in the Options dialog box (Tools > Options ) to
modify application behavior, including:
l turning features on or off
l changing how data is displayed
l controlling some aspects of command output
Script Editor
For information about customizing the Script Editor, see "Customizing the Script Editor" on
page 1611.
Caution
Clicking Factory sets all options on all Options tabs to their default settings, not just
the options on the active tab.
Steps
1. From the Analytics main menu, select Tools > Options .
2. Click the tab with the option that you want to change and modify the setting.
The following tabs are used to group related options:
l System tab
l Interface tab
l Table tab
l View tab
l Command tab
l Numeric tab
l Print tab
System options
Use the option in the System tab to control how memory is used for sorting and indexing operations.
Additional information
l If Analytics is unable to determine the number of records in a file being sorted or indexed, a
memory mapped file is not used for the operation, even if Use Additional System Resources
for Sorting and Indexing is selected.
l Using additional system resources for sorting and indexing may slow down other tasks while
sorting or indexing is in progress.
l Unlike all other options in the Options dialog box, the setting for this option is not stored in the
Analytics preferences file. The option applies only to the computer on which it is set.
For more information about options stored in the preferences file, see "How Analytics
preferences files work" on page 146.
Interface options
Use the options in the Interface tab to specify some of the basic behavior of Analytics.
Note
This option controls script syntax checking only. It does not control analytic header
validation, which is a separate process that cannot be disabled.
Table options
Use the options in the Table tab to specify how Analytics processes tables.
Note
Tables with numeric fields will open more slowly with this option on.
Caution
Be careful when turning this option on. It may be an original data file that is deleted
along with the table.
Data files are deleted outright. They are not sent to the Windows Recycle Bin.
You can also use the SET DELETE_FILE command in a script or on the command line to turn this option
on or off.
If you change the setting using the Delete Data File with Table checkbox, the change remains in
effect until you specifically change it again. If you use the SET DELETE_FILE command to change the
setting, the change remains in effect for the duration of the Analytics session only.
Changes to this setting are recorded in the log using the following syntax:
SET DELETE_FILE {ON|OFF}
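For example, a script could turn the option on only for a cleanup routine and then turn it off again, so
that the rest of the session keeps the default behavior. A minimal sketch, assuming the DELETE
FORMAT form of the DELETE command and an illustrative table name:

COMMENT Delete the data file along with the table layout, for this block only
SET DELETE_FILE ON
DELETE FORMAT Temp_Extract OK
SET DELETE_FILE OFF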
If you deselect this option, a single table layout can be shared by multiple data files or data sources
with an identical record structure. The feature works with only those Analytics operations that can
output results to an Analytics table with an identical record structure – extracting, sorting, sampling,
and merging – and with copying table layouts.
When sharing of table layouts is permitted, multiple source data files (e.g., Analytics data files (.fil)) or
data sources that have the same record structure share a single set of field definitions. When you add
a physical or computed field to a shared table layout, add a column to an associated view, or add a
view, the added field, column, or view is automatically added to all the Analytics tables that use the
shared table layout. When you delete a field, column, or view, it is no longer available to any of the
Analytics tables that use the shared table layout.
Generally, you should maintain a separate table layout for each data file. However, sharing a single
table layout can save labor if multiple data files with the same record structure require an identical set
of field definitions, and any updates to the table layout will apply to all the data files. For example,
extracting records from an annual transactions table into twelve separate monthly tables produces a
total of thirteen tables with the same record structure. If the Don’t Share Table Layouts checkbox is
selected, each table has its own layout. If the Don’t Share Table Layouts checkbox is deselected, all
the tables share the original table’s layout and the layout can be managed centrally.
Deleting a shared table layout from one of the tables that uses it does not perform a global deletion.
The shared table layout is still available to the other tables that use it.
Sharing does not extend beyond individual Analytics projects. If you copy a table to another project, a
new table layout is created, regardless of how Don’t Share Table Layouts is set.
Note
Blank spaces are treated like characters.
If the option is on
If the option is on, comparison strings must be exactly identical to be a match. When comparing two
strings of unequal length, Analytics pads the shorter string with trailing blank spaces to match the
length of the longer string.
Applicability
Some Analytics operations and functions are affected by the Exact Character Comparisons option
and some are not:
Log entry
Changes to this setting are recorded in the log using the following syntax:
SET EXACT {ON|OFF}
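In a script, you can set the option explicitly before running comparisons that depend on it. A minimal
sketch; the field name and comparison value are illustrative:

COMMENT Require exact character comparisons for the filter below
SET EXACT ON
SET FILTER TO Vendor_Code = "AB"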
Buffer Size
This option specifies the size of the data block read. The default is 33K (kilobytes), which is the
recommended buffer size for most applications.
Acceptable values range from 5 to 255. Changing the buffer size may provide small performance
improvements in certain environments. You should only change this setting if you are advised to do
so by Support.
Changes to this setting are recorded in the log using the following syntax:
SET READAHEAD value
Sort Memory
This option specifies the maximum amount of system resources to be allocated for sorting and
indexing processes. The sort memory can be any value from 0 to 2000MB (megabytes), in 20MB
increments. To optimize Sort performance, set the sort memory according to the available physical
memory in the system. This enables Analytics to use the necessary amount of memory to sort a table
up to this maximum, if required.
If the sort memory is left as 0, Analytics uses the system resources currently available.
Sort Order
This option sets the sort sequence for character fields.
Choose the locale from the drop-down list. The default is “System Default” for the non-Unicode
edition of Analytics and “Mix Languages (UCA)” for the Unicode edition. By default, Analytics sorts
data in ascending order based on the byte order of each character in its character set. The Sort
Order option affects sort order when sorting or indexing, or performing a quick sort, and when testing
sequential order.
Changes to this setting are recorded in the log using the following syntax:
SET ORDER values
View options
Use the options in the View tab to specify how Analytics displays views.
Changes to this setting are recorded in the log using the following syntax:
SET CLEAN {ON|OFF}
Redraw Seconds
This option specifies the maximum amount of time in seconds that Analytics takes to redraw the view.
If redrawing takes longer than the specified amount of time, Analytics interrupts processing and
displays a message. The maximum you can specify is 100 seconds. The default time is 10 seconds.
You may need to increase the redraw time when using restrictive filters that select very few records,
especially when working with very large files. When you increase the redraw time, you may have to
wait longer to view the data. To reduce the waiting time, turn off the Hide Filtered Records option.
Command options
Use the options in the Command tab to specify how Analytics executes commands.
Autoexecute Commands
If you turn this option on, Analytics immediately executes certain commands using the selected field in
the view as input. You cannot edit the command or apply a local filter. The option applies only to some
commands, and the selected input field must be of the data type required by the command.
Intervals
This option indicates the number of intervals chosen by default for a stratification or histogram. Enter
a number from 1 to 255. The default is 10.
Error Limit
This option sets the default number of errors after which Analytics stops processing the Sequence or
Verify commands. Enter a number from 1 to 255. The default is 10.
Specifies the number of times Analytics attempts to import or export data if the initial attempt is
unsuccessful. Enter a number from 0 to 255. If you enter 0, no additional attempts are made after an
initial failure. The default is 0.
There is no waiting period between retry attempts. Each successive attempt is made immediately
after a preceding failure.
The ability to specify retry attempts is useful when connecting to databases or cloud data services,
which can be temporarily unavailable.
Changes to this setting are recorded in the log using the following syntax:
SET RETRY num
Applies to the following commands:
Import o ACCESSDATA
o IMPORT GRCPROJECT
o IMPORT GRCRESULTS
o REFRESH
(for tables initially created using ACCESSDATA or IMPORT SAP only)
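For example, a script that refreshes a table originally created from a cloud data source could allow a
few retries. A minimal sketch; the table name is illustrative:

SET RETRY 3
REFRESH Cloud_Invoices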
Maximum Categories
This option specifies the maximum number of unique values that can occur in a character key field
used as input for the Train command. Enter a number from 1 to 255.
Notify Settings
Retry Attempts
This option specifies the number of times the Notify operation will attempt to send an email if the initial
attempt is unsuccessful. Enter a number from 0 to 255. If you enter 0, no additional attempts are
made after an initial failure. The default is 5.
One possible reason for the Notify operation failing to send an email is that the email server is
unavailable.
Date Settings
Day, Month, Year
Use the Day , Month, and Year text boxes to specify the characters that represent these components
of date and datetime formats. The default values are ‘D’ for Day , ‘M’ for Month, and ‘Y’ for Year, but
you can specify different characters for languages other than English. The characters you specify
must be uppercase, they must all be different, and ‘D’, ‘M’, and ‘Y’ can only be used in their default
positions.
Note
This option has no effect on how Analytics reads dates from data sources. To specify
how Analytics reads dates, use the Data Definition Wizard, or the Format field in the
Table Layout dialog box. For more information, see "Formats of date and time source
data" on page 347.
Examples of date display formats:
DD/MM/YY – 31/12/14
DD/MM/YYYY – 31/12/2014
MM/DD/YY – 12/31/14
MM/DD/YYYY – 12/31/2014
YYYYDDD – 2014365
YYYY-MM-DD – 2014-12-31
Format characters:
DD – Day (1 – 31)
MM – Month (1 – 12)
Note
If you specify a date display format that does not display all the available source data,
quick filtering by date or datetime is disabled. For example, if you specify the format
MMM YYYY for dates that have day, month, and year data, quick filtering on a date or
datetime value in a view returns zero results.
Changes to the date display format are recorded in the log using the following syntax:
SET DATE value
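For example, the following sketch switches the date display format to day-month-year. The display
format value is shown quoted here as an assumption; the change affects display only, not the source
data:

SET DATE "DD/MM/YYYY"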
Start of Century
Many data files use only two digits to represent the year, which means the century in which the year
occurs is unspecified. The two-digit year denoting the earliest year assigned to the 20th century can
vary from one set of data files to the next. This year is often called the start-of-century year or the pivot
year.
The pivot year applies to two-digit years only, and does not affect data that uses four digits to
represent the year. Analytics can read four-digit years from 1900 to 9999.
The default Start of Century setting is 40. With this setting, Analytics interprets two-digit years 40 to
99 as 1940 to 1999, and two-digit years 00 to 39 as 2000 to 2039.
To change the pivot year, enter a number from 0 to 99. For example, if you want to set 1950 as the
pivot year, enter 50 in the Start of Century text box. The table below provides examples of different
pivot years.
Start of Century 00 – two-digit years 00 to 99 are interpreted as 1900 to 1999
When working with data files that use a different pivot year from the Start of Century year, you can
use an expression to create a computed field that correctly interprets the two-digit year or converts it
to a four-digit year.
Changes to the Start of Century setting are recorded in the log using the following syntax:
SET CENTURY value
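A minimal sketch of the computed-field approach described above, assuming an illustrative two-
character field named Year_2digit and a pivot year of 50:

DEFINE FIELD Year_4digit COMPUTED
"19" + Year_2digit IF VALUE(Year_2digit, 0) >= 50
"20" + Year_2digit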
Aging Periods
This option sets the default aging periods for the Age dialog box. If you use a specific set of aging
periods frequently, you can enter the set in the Aging Periods text box and Analytics uses the setting
as the default aging periods in the Age dialog box. If necessary, you can still override the periods in
the Age dialog box.
Enter the periods in days, separated by commas without spaces. You can set as many aging periods
as you want.
Changes to this setting are recorded in the log using the following syntax:
SET PERIODS values
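For example, to set monthly-style aging periods as the default (the values are illustrative):

SET PERIODS 0,30,60,90,120,10000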
Time Settings
Hour, Minute, Second
Use the Hour, Minute, and Second text boxes to specify the characters that represent these
components of time and datetime formats. The default values are ‘h’ for Hour, ‘m’ for Minute, and ‘s’
for Second, but you can specify different characters for languages other than English. The characters
you specify must be lowercase, they must all be different, and ‘h’, ‘m’, and ‘s’ can only be used in their
default positions.
Note
This option has no effect on how Analytics reads times from data sources. To specify
how Analytics reads times, use the Data Definition Wizard, or the Format field in the
Table Layout dialog box. For more information, see "Formats of date and time source
data" on page 347.
Note
If you specify a time display format that does not display all the available source data,
quick filtering by datetime or time is disabled. For example, if you specify the format
hh:mm for times that have hour, minute, and seconds data, quick filtering on a
datetime or time value in a view returns zero results.
Changes to the time display format are recorded in the log using the following syntax:
SET TIME value
Conversion of local time to UTC is for display purposes only, and does not affect the source data,
which continues to contain the UTC offset. You can change back and forth between the two different
display modes whenever you want to.
When Analytics performs calculations on local time data with a UTC offset, the UTC offset is automat-
ically incorporated and the calculation is performed on the UTC equivalent of the local time. If Display
Times with UTC Offset as UTC is checked, you see the actual time data that is being used in a
calculation, which can make the results easier to understand. For more information, see "How UTC
offsets affect datetime expressions" on page 830.
About UTC
UTC is a global time standard that has replaced Greenwich Mean Time (GMT). For most purposes,
the two standards are equivalent. The final portion of UTC-based time data (for example, -05:00, or
+01:00) is a UTC offset that indicates how far behind or ahead the local time value is compared to
UTC. For example:
l 31/12/2014 10:30:15-05:00 represents December 31, 2014, 10:30:15 AM, Eastern Standard
Time (North America).
l 31/12/2014 15:30:15 (UTC) represents the same point in time at zero degrees longitude.
For UTC-based datetime data, if conversion to UTC goes forward or backward across the boundary
of midnight, the date is adjusted by one day.
Note
The UTC offset is also referred to as the time zone offset, although the two are not
exactly the same. More than one time zone can have the same UTC offset.
Numeric options
Use the options in the Numeric tab to specify how Analytics processes and displays numeric data.
Verify Data
If you turn this option on, every time you process a field while a table is open, Analytics automatically
checks whether the contents of a data field correspond to the field’s data type in the table layout
(Character, Numeric, Datetime, Logical, etc.). Processing stops when an error occurs, unless the
Blank Invalid Data option is also on.
If you turn this option off, Analytics does not test for data validity, therefore improving processing
speed.
Changes to this setting are recorded in the log using the following syntax:
SET VERIFY {ON|OFF}
Changes to this setting are recorded in the log using the following syntax, where BLANK indicates
that the option is selected and ON means that the Verify Data option is selected, but Blank Invalid
Data is not:
SET VERIFY {BLANK|ON}
Thousands Separator
Analytics uses a comma as the default thousands separator for numeric output. You can change the
default setting to either a period or a space by entering the new character in the text box. The
thousands separator cannot be the same as the decimal separator.
List Separator
Analytics uses a comma as the default list separator, which is used primarily to separate function
parameters. You can change the default setting to either a semi-colon (;) or a space by entering the
new character in the text box. The list separator cannot be the same as the decimal separator.
Print options
Use the options in the Print tab to specify the default print settings for reports and margin settings for
printed output.
Margins
The Left Margin, Top Margin, Right Margin, and Bottom Margin text boxes allow you to specify the
margins for all printed output. To change a value, enter the new value in the text box, or click the up
and down arrows beside the text box to increase or decrease the value.
If you specify a margin that exceeds your printer’s printable area, Analytics uses the maximum of your
printer’s printable area as the margin.
Changes to each of the individual margin settings are recorded in the log using the following syntax:
SET MARGIN {LEFT|RIGHT|TOP|BOTTOM} value
Fixed-width Font
Analytics uses fixed-width fonts for information displayed in the Table Layout, Script, and
Workspace windows. The default fixed-width font is Courier New. You can choose another font from
the list box.
Proportional Font
Analytics uses proportional fonts in views and reports, and to display information such as the project
file name, the table, and the record count in the status bar. The default proportional font is Arial. You
can choose another font from the list box.
Language Version
Analytics allows letters, numbers, and the underscore character to be used in field names. The
default Standard language version setting accommodates Western European characters for field
names. The Thai setting allows Thai characters to be used in addition to English.
The settings for the configurable options in Analytics – that is, the Options dialog box settings – are
stored in a preferences file (.prf file) called aclwin15.prf (non-Unicode edition) or acl15.prf
(Unicode edition).
Any changes you make in the Options dialog box are automatically saved to the .prf file. The changes
remain in effect unless you specifically change them again.
Note
The application data folder may be hidden by the Windows operating system. If
required, enable the Windows folder option to show hidden files and folders.
The .prf file in the application data folder contains the global preference settings for Analytics. Any
changes you make in the Options dialog box are saved to this global .prf file, unless you are using a
project-specific .prf file.
The global .prf file is used when:
l you open Analytics without opening an Analytics project
l you open a project that does not have a project-specific .prf file
l you close a project without closing Analytics.
Caution
If you copy the global .prf file, be careful not to inadvertently move the file rather than
copy it. If you move the file, any global preference settings you have created will be
lost, and replaced by the default configuration settings.
Note
If you have different versions of Analytics installed side-by-side, make sure to copy
the correct version of the .prf file.
The Analytics project file with the .acl extension and the project-specific .prf file must be in the same
folder for the association between the two to take effect. When the project is open, the preference
settings specified in the project-specific .prf file are used. Any changes you make in the Options
dialog box are saved to the project-specific .prf file rather than the global .prf file.
Note
You must use the Plus, Minus, and 0 keys on the number pad, not on the main
keyboard. On laptops, press Fn+Ctrl+ the appropriate key on the number pad.
Note
Once you click Close the changes are saved and Reset does not reverse them.
You can revert to the default toolbar settings by selecting Tools > Options >
Factory .
Note
If you want other Analytics users to have the custom menu items, give them the *.mnu
file with instructions about where to locate the file.
Sub-menu entries
Each *.mnu file creates a separate sub-menu entry under the Applications menu. For example,
Account scripts.mnu creates the Account scripts sub-menu entry and this menu structure:
Applications > Account scripts .
Sub-menu entries appear in alphanumeric order on the Applications menu.
Tip
Users can become disoriented by too many sub-menu levels. A best practice is to limit
sub-menu levels to three.
Tip
Create or edit your menu files in a text editor such as Notepad++ with all non-printing
characters displayed so that you can see exactly what characters are contained in the
file.
Use a monospaced or fixed-width font so that individual characters align vertically.
A sample menu file, Template.mnu, is included in the Sample Data Files folder installed with
Analytics.
l Template.mnu creates the sub-menu entry Template in the Applications menu in Sample
Project.acl, and in the three other sample Analytics projects contained in the Sample Data
Files folder.
l The Template sub-menu entry contains six custom menu items at the first level.
l One of the first-level custom menu items, Margins Analysis, contains four custom menu items
at the second level.
l Most of the custom menu items in Template.mnu are simply placeholders to illustrate the
concept of menu files.
Content of Template.mnu
MAIN MENU 6 .
Margins Analysis 8 menu_def .
Inventory Analysis PAUSE 'SAMPLE INVENTORY ANALYSIS BATCH' .
Accounts Payable Analysis PAUSE 'LAUNCH YOUR A/P BATCH(ES)' .
Accounts Receivable Analysis PAUSE 'DO A/R BATCH(ES) HERE' .
Internal Audit Functions PAUSE 'SAMPLE INTERNAL AUDIT PROCESSES' .
Quit ACL QUIT .
.
MARGINS ANALYSIS 4 .
Exception Listing PAUSE 'DO Batch where margin<=0' .
High Margin Products PAUSE 'Sample Batch top 5 margins' .
Low Margin Products PAUSE 'Calculate lowest 5 margins' .
Margin Statistics STATISTICS .
Line length – Each line of the menu file must be exactly the same length.
Although not required, it is good practice to use a period (.) to visually mark the end of
each line, immediately before the line break.
Line numbering – The lines in the menu file are counted from zero (0).
Keep this numbering scheme in mind whenever you specify line number references in
the menu file syntax. If the text editor you are using displays and counts line numbers
beginning at 1, you need to decrement the text editor line number by 1 when you specify
menu file line number references.
In the example above, the Margins Analysis menu item appears on line 1, and the
MARGINS ANALYSIS sub-menu syntax block appears on lines 8 through 12.
Blank lines – Blank lines can appear between syntax blocks but not within syntax blocks.
Blank lines, composed of space characters, must be the same length as the other lines in
the menu file.
Although not required, one or more blank lines between syntax blocks provides visual
separation in the menu file.
Syntax blocks – Syntax blocks define each group of custom menu items. You can use multiple syntax
blocks to create multiple menu levels.
o The left side of the block contains the names of the menu items, one per line. These
are the names that appear on the menu in Analytics.
o Names can be a maximum of 35 characters.
o The right side of the block contains either an ACLScript command or a line reference
to a lower-level block of syntax.
o Lines on the right side of the block must all start at character position 37.
o Use only space characters to align text elements. Do not use tab characters.
Note
Even one tab character in a menu file will cause the file to be ignored.
Use a text editor that can display tab characters so you can check for
their presence.
Reference to a lower-level block of syntax – A reference from a menu item to a lower-level block of
syntax takes the form num menu_def. num specifies the line number on which the lower-level block
of syntax starts – that is, the heading line of the lower-level syntax block.
In the example above, line 1 contains the Margins Analysis menu item, which refers to the
line on which the MARGINS ANALYSIS lower-level syntax block starts ( 8 menu_def ).
Custom menu items – Short commands can be entered directly in the .mnu file. Longer commands with
multiple parameters should be saved in a script, which can be referenced using the DO SCRIPT
command.
Note
The script must be included in the Analytics project in which the custom
menu item appears.
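For example, a single menu file line could call a project script. An illustrative sketch only; in an actual
menu file the command must begin at character position 37, and every line must be padded with
spaces to the same length:

Monthly Exceptions                   DO SCRIPT Monthly_exceptions .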
Keep the following points in mind when you edit a menu file that is already in use:
l Before you edit any menu file, make a backup copy of it.
l If you add or remove lines, make sure to adjust any line number references appropriately.
l Wherever possible, add new items at the end of the menu file in order to maintain the existing
line references.
1. Copy Template.mnu from the Analytics Sample Data Files folder to a working location.
Caution
Do not edit the original template file. If you run into problems you can recopy the
original file and start again.
Note
If you are creating a menu file from scratch, change the file extension to .mnu.
3. Open the renamed file in a text editor such as Notepad++ and edit it to build sub-menus and
custom menu items.
Follow the "Menu file syntax requirements" on page 155 above exactly.
4. Do one of the following:
l Save the file in the folder containing the Analytics project in which you want the custom menu
items to appear.
l Save the file in the Analytics executable folder to make the custom menu items available in all
Analytics projects.
Tip
You can create both project-level and global menu files, if required.
Analytics includes a language called ACLScript that is used throughout the application to process
commands and record analysis steps. For example, when you select Analyze > Count from the main
menu and complete the required information in the Count dialog box, Analytics automatically
converts the information you entered to a command statement that is used to run the command and
record the action in the log.
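For example, completing the Count dialog box with a condition might be recorded in the log as a
command statement like the following (the field name and value are illustrative):

COUNT IF Amount > 1000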
Steps
1. If the Command Line text box is not visible, select Window > Command Line.
2. Enter the command text using one of the following methods:
l Type in the command using the required syntax.
l Click an entry in the Log tab in the Navigator to add the command to the command line. You
can run the command as is, or edit it before running the command.
l Copy the command syntax from an existing Analytics script, or other text file, and paste it in
the Command Line text box. You can run the command as is, or edit it before running the
command.
3. Optional. If the command has a dialog box associated with it in the Analytics user interface,
click Edit Command to display the associated dialog box, which you can use to modify the
parameter settings for the command.
4. Click Run or press Enter to run the command.
The Run, Clear Entry , and Edit Command options are also available by right-clicking in the
Command Line text box.
Note
The email notification feature can be used with any mail server that supports SMTP
(Simple Mail Transfer Protocol), which is used by Microsoft Exchange and many
other mail servers. Email notification uses port 25, so this port must be open on the
mail server or the command fails.
If email notification fails with an error message, contact your IT department to find out
if port 25 is blocked on your network.
l Mailbox Path – Enter the hostname or IP address of your SMTP mail server.
If you are using a local mail system, enter the path to a local mailbox or click Browse to open
the Browse for Folder dialog box.
l To – Enter the email addresses of recipients. Separate the names and addresses of multiple
recipients with commas.
Note
Enter a maximum of 1020 characters.
l Cc – Optional. Enter the email addresses of “carbon copy” recipients. Separate the names
and addresses of multiple recipients with commas.
Note
Enter a maximum of 1000 characters.
l Bcc – Optional. Enter the email addresses of “blind carbon copy” recipients. Separate the
names and addresses of multiple recipients with commas.
l Subject – Enter the text of the subject line.
l Text – Enter the text of the message.
l Attachment – If you want to include an attachment, specify the path and filename of the file,
or click Browse to open the Select File dialog box.
3. Click OK.
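In a script, the same settings are supplied as parameters to the NOTIFY command. A rough sketch
with illustrative values and typical parameter keywords; confirm the exact parameter names against
the NOTIFY command reference for your edition:

NOTIFY USER "analytics@example.com" MAILBOX "smtp.example.com" ADDRESS "auditor@example.com" SUBJECT "Exception report" MESSAGE "The monthly exception script has finished." ATTACHMENT "C:\Results\Exceptions.xlsx"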
Analytics projects
Analytics projects are the highest level of organization in Analytics, and they store the information
associated with a data analysis project.
The main Analytics project file (.ACL) stores most of the project information. A set of additional files
store particular types of information about the project, such as the log or indexes. Data is stored
outside the project in native Analytics data files, or in a database.
The Analytics project you are currently working with is displayed in the Overview tab in the
Navigator. The contents of the log are displayed in the Log tab. Only one project can be open at a
time.
Sample Project.ACL appears below in the Navigator.
Table An Analytics table, which consists of two parts: a table layout and an associated
data source.
The table layout contains information about how to display the data, such as
record length and field names. The data source is a file or data set (e.g., database
table) that contains the content of the table. The data source exists outside the
Analytics project.
Server Table A table with a table layout that resides locally in Analytics, and an associated data
source on a server. The table layout connects to the data source using a
database profile and/or server profile.
Script A series of ACLScript commands that can be run from within the Analytics project.
Server Script An ACLScript file (.aclscript, formerly .bat) that is located on a server.
Workspace An Analytics project item that contains one or more field definitions that have
been saved for reuse with other tables.
Project The top-level entry in the treeview is the Analytics project. Projects are stored in
physical files with a .ACL file extension.
Log A record of the commands issued when working with the Analytics project.
Folder A folder inside the Analytics project. These folders exist only in the Analytics
project file (.ACL). They are not physically created as Windows folders.
Session entry Individual sessions indicated by date and time. Sessions are created whenever
you open the project, or when you create a session manually.
File type (extension) – Description
Analytics Project file (.ACL/.acl) – The Analytics project file is where all of the critical information for
your data analysis project is stored:
o table layout and view definitions
o scripts
o project folders
o command syntax that updates tables using the Refresh from Source command
o table history
o workspaces
Analytics data file (.fil) – In many cases, when you define an Analytics table from a data source, the
data is copied from the data source into a new Analytics data file with a .fil file extension.
For a list of data source types that copy data to .fil files, see "Data sources you can
access with Analytics" on page 227.
Log file (.log) – The log file records all commands executed by Analytics while the project is open.
The default log is assigned the same name as the Analytics project file, with a .log
extension. If necessary, you can specify a custom log file name.
Log index file (.lix) – An index file used to associate log entries with particular sessions. Sessions are
created each time you open a project, and can also be created manually at any time.
Index file (.inx) – An index file is created when you index an Analytics table. The file name is the
same as the name of the index in Analytics, with an .inx extension.
An index file is also created when you use the Quick Sort Ascending or Quick Sort
Descending commands on a table. The filename for indexes created by quick sort
commands is ACLQSTMP.inx.
These file types are not required by the project; however, if they exist, you may want to include them in
any backup process.
Note
The combined length of the Analytics project path and the project name, including the
file extension (.acl), cannot exceed 259 characters.
o View
o Script
o Workspace
Note
When you copy a table, you are copying the table layout only, not the source
data file (.fil).
3. In the Locate Project File dialog box, locate and select the Analytics project you want to copy
the project items from and click Open.
4. In the Import dialog box, complete any of the following tasks to add project items to the To
project_name list:
o Double-click an individual project item.
o Ctrl+click multiple project items and then click the right-arrow button.
o Click Add All to add all the project items.
You can remove project items from the To project_name list by double-clicking an individual
project item, by using Ctrl+click to select multiple project items and then clicking the left-arrow
button, or by clicking Clear All .
5. Click OK to copy the project item or items into the current project.
If an item with the same name already exists in the project, the copied item is given an
incrementing numeric suffix.
view – .rpt
script – .aclscript
workspace – .wsp
Show me how
Note
Limit the item name to 64 alphanumeric characters, not including the file extension, to
ensure that the name is not truncated when the item is imported back into Analytics.
The name can include the underscore character ( _ ), but do not use any other special
characters, or any spaces, or start the name with a number. Special characters,
spaces, and a leading number are all replaced by the underscore character when the
item is imported.
Export a view
1. Open the table associated with the view.
2. In the Overview tab of the Navigator, right-click the table and select Properties > Views .
3. Select the view, and click Export.
4. In the Save As dialog box, choose a location to save the view, rename the view if required, and
click Save.
5. Click OK in the confirmation dialog box.
The view is exported to the location you specified.
Match case – Only matches with exactly matching upper and lower case are found.
For example, a search for “Comment” does not match “COMMENT” if this option is
selected, but it would otherwise.
3. Right-click in the Log tab and select Save Selected Items > [Export Type] where Export Type
is one of the following options:
Note
If the Command Line isn't visible, select Window > Command Line from the
Analytics main menu.
Table layout notes appear in printed Analytics reports if Include Report History with Reports is
selected in the Options dialog box (the default setting). For more information, see "Print options" on
page 144.
Show me how
Note icon
Records that have a note attached are identified by a note icon next to the record number in the view.
Tables that have one or more records with a note are identified in the Overview tab in the
Navigator with a note icon in the bottom left corner of the standard table icon.
Steps
Show me how
Tip
To add or edit multiple record notes simultaneously, use the NOTES command.
1. Right-click the appropriate record number in the record number column in the View tab (the
grey, first column on the far left) and select Edit Note.
2. Enter a new note or edit the existing note.
To delete the note, delete all the text.
3. If you want to create a link to a related file, do the following:
a. Position the cursor at the location in the note where you want to insert the link.
2. Click Run .
Show me how
Note
Individually deleting all the record notes in a table does not delete the auto-generated
RecordNote field from the table layout, which means the note icon continues to
appear with the table icon in the Overview tab in the Navigator.
If your intention is to delete all the record notes in a table, use the method for deleting
all record notes, which also deletes the RecordNote field.
2. Click Run .
Show me how
1. Select Edit > Table Layout.
2. In the Edit Fields/Expressions tab, double-click the field you want to add a note to.
Note
If you choose to print the command log, the entire log is printed, which could be
numerous pages depending on the size of the log.
o Preferences – prints a list of the currently selected preference settings in the Options dialog
box
o Project Notes – prints any notes recorded for the project
o Log – prints the entire command log
o Page Break after each Category – inserts a page break after each project item category,
and after preferences, project notes, and log entries. If the checkbox is not selected, each
category is listed immediately after the previous category.
o Page Break after each Item – inserts a page break after each item within a category. For
example, if you have selected three scripts, a page break will be inserted after each of the
script definitions. If the checkbox is not selected, each item is listed immediately after the
previous item in the category.
5. Click Print.
6. In the Print dialog box, configure any necessary printer settings and click Print. You can use
the Print dialog box to modify settings, such as the printer to send the print job to and printer-
specific options such as the page size and page orientation.
Guidelines
When you attempt to open a project that closed unexpectedly, an ACL Analytics dialog box is
displayed presenting you with three options for recovering the project file. Select the appropriate
option from the following:
l Click Working if you made modifications to project items or performed analysis steps after you
last saved the project and you do not want to lose the log entries for these operations.
Note
The Working copy has the most complete information, but it may be corrupted if
Analytics closed while a command was being processed.
l Click Last-saved if any unsaved changes in the project are not important to you.
l Click Cancel if you want to retain the option of using either version of the project file. After you
close the dialog box, navigate to the Windows folder where the project files are stored and
create a backup of both the Working copy and the Last-saved version using different file
names.
Note
Analytics tables include a table layout, visible in the Navigator, and an associated
source data file with a .fil extension, not visible in the Navigator, stored in a Windows
folder.
Understanding the difference between the table layout and the source data file can be
important when saving results and specifying output folders.
For more information, see "The structure of Analytics tables" on page 115.
Saving results
When saving results to an Analytics table or a text file, you have the following options:
l Save – save the results to a new Analytics table or a text file
l Append – append the results to an existing Analytics table or a text file
l Overwrite – overwrite an existing Analytics table or a text file
Appending updates the source data file but does not alter the table layout. Overwriting replaces both
the source data file and the table layout.
Note
Some Analytics operations support saving result to either an Analytics table or a text
file, but not both.
Caution
Before saving results in this manner, you should be certain that overwriting source
data in another project is your intended outcome.
Table layout – can be output to:
o the Analytics project folder containing the active table (the default)
o an Analytics project folder other than the active table folder, specified using the
SET FOLDER command
Source data file (.fil) – can be output to:
o the Windows folder containing the Analytics project (the default)
o a Windows folder other than the folder containing the Analytics project
o the Prefix folder on AX Server (server tables only; the default)
o a folder on AX Server other than the Prefix folder (server tables only)
The output folder remains as whatever you set it – until you reset it, or close the project. When you
reopen the project, the output folder reverts to the default of the active table folder.
Note
File paths specified in the SET FOLDER command must use a forward slash.
SET FOLDER /Results – The table layout is placed in the Results Analytics project folder rather than the
active table folder.
SET FOLDER /Results/Duplicates – The table layout is placed in the Duplicates Analytics project
subfolder rather than the active table folder.
SET FOLDER / – The table layout is placed in the Analytics project root directory rather than the
active table folder.
SET FOLDER – Resets the output folder to the default of the active table folder.
Save results to new Analytics table – Table layout added to the same Analytics project folder as the
active table (the default), or to the Analytics project folder specified by the SET FOLDER command.
Append results to existing Analytics table in project – Existing table layout not moved, whether or not
SET FOLDER is used.
Save results and overwrite existing Analytics table in project – Table layout moved to the same
Analytics project folder as the active table (the default), or to the Analytics project folder specified by
the SET FOLDER command, unless already in the same folder.
Append results to existing Analytics table in another project – Table layout added to the same
Analytics project folder as the active table (the default), or to the Analytics project folder specified by
the SET FOLDER command.
Tip:
To ensure the table layout is saved in the appropriate Analytics project folder,
begin the import process by right-clicking the folder.
4. Prior to performing an operation that saves results to an Analytics table, if necessary, use the
SET FOLDER command to specify the appropriate Analytics project folder for the resulting
table layout.
For more information, see "Saving results and specifying output folders" on page 189.
5. In the dialog box associated with the operation, specify the appropriate Windows folder for the
source data file using an absolute or relative file path, or by navigating to the folder.
For example: C:\Results\Classify.fil, or Results\Classify.fil.
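Putting steps 4 and 5 together in a script might look like the following sketch, where the command,
field names, and paths are illustrative:

COMMENT Place the output table layout in the Results project folder
SET FOLDER /Results
COMMENT Write the source data file to a specific Windows folder
CLASSIFY ON Vendor_No SUBTOTAL Amount TO "C:\Results\Classify.fil" OPEN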
Extracting data
Extracting allows you to copy some or all of the records or fields from an Analytics table to a new
Analytics table.
The new table can be:
l an identical copy containing all the source table records and fields
l a subset of the records in the source table
l a subset of the fields in the source table
l a subset of both the records and the fields in the source table
The existing sort order in the source table is maintained in the new table.
Note
Extracting data and appending it to the end of an existing Analytics table is a data
combining technique. It is explained in the section on combining data. For more
information, see "Extracting and appending data" on page 871.
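In ACLScript, extracting is performed with the EXTRACT command. A minimal sketch, assuming an
open table and illustrative field and table names:

COMMENT Copy entire records that meet a condition to a new table
EXTRACT RECORD TO "High_Value" IF Amount > 5000 OPEN
COMMENT Copy a subset of fields to a new table
EXTRACT FIELDS Vendor_No Invoice_No Amount TO "Invoice_Subset" OPEN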
Which fields are extracted? – Only fields that are currently displayed in the view are extracted. Any
additional fields that are part of the table layout but not displayed in the view are not extracted.
All fields in the view are extracted. If you want to extract a subset of fields, remove the
unwanted fields from the view, create a new view with just the required fields, or use
extract by fields instead of extract by view.
Field order – The fields are extracted in the order they appear in the view. If you want to extract the
fields in a different order, rearrange them in the view, or create a new view with the fields
in the desired order, prior to extracting.
Filtering – If a filter is currently applied to the view, only the data that meets the filter criteria is
extracted.
Record notes – Record notes are extracted only if the RecordNote column has previously been added
to the view.
Alternate column titles – If any alternate column titles are specified at the view level, extract by view
preserves the view-level titles. If you use the syntax in the command log to rerun the extract command,
alternate column titles specified in the table layout are used, and view-level titles are
ignored.
Scripts / Command line – Specifying extract by view is not supported in scripts or from the command
line. When rendered in ACLScript, extract by view is actually an extract by fields ( EXTRACT FIELDS )
using all the fields in the active view, in the order in which they appear in the view.
Steps
You can extract some or all of the records or fields from an Analytics table and output them to a new
Analytics table.
Note
Extracting data and appending it to the end of an existing Analytics table is a data
combining technique. It is explained in the section on combining data. For more
information, see "Extract and append data" on page 876.
Show me how
1. Open the table from which you want to extract records or fields.
2. Select Data > Extract.
3. On the Main tab, select one of the following:
l Record – extract entire records
The fields in the record are extracted in the order they appear in the table layout.
l View – extract all the fields in the current view
The fields are extracted in the order they appear in the view.
l Fields – extract a selection of individual fields
The fields are extracted in the order you select them.
If you want to extract data from a child table in a table relation, select Fields , or select View if
the child table fields have previously been added to the view. You cannot extract child table
data using the Record option.
Note
If you are extracting one or more computed fields, selecting Record preserves
the extracted fields as computed expressions.
Selecting View or Fields converts the extracted fields to physical fields of the
appropriate data type and populates them with the actual computed values.
Note
The If condition is evaluated against only the records remaining in a table after
any scope options have been applied (First, Next, While).
o Click To and specify the name of the new Analytics table, or select an existing table in the
Save or Save File As dialog box to overwrite the table.
If Analytics prefills a table name, you can accept the prefilled name, or change it.
You can also specify an absolute or relative file path, or navigate to a different folder, to save
the new table or overwrite an existing table in a location other than the project location. For
example: C:\Results\GL_2011.fil or Results\GL_2011.fil. Regardless of where you
save or overwrite the table, it is added to the open project if it is not already in the project.
7. If you are connected to a server table, do one of the following:
l Select Local to save the output table to the same location as the project, or to specify a path or
navigate to a different local folder.
l Leave Local deselected to save the output table to the Prefix folder on a server.
Note
For output results produced from analysis or processing of Analytics Exchange
server tables, select Local . You cannot deselect the Local setting to import
results tables to Analytics Exchange.
8. Select Use Output Table if you want the output table to open automatically upon completion of
the operation.
9. Click the More tab.
10. Select the appropriate option in the Scope panel:
l All
l First
l Next
l While
Show me more
All This option is selected by default. Leave it selected to specify that all records in the view are
processed.
First Select this option and enter a number in the text box to start processing at the first record in the
view and include only the specified number of records.
Next Select this option and enter a number in the text box to start processing at the currently selected
record in the view and include only the specified number of records. The actual record number
in the leftmost column must be selected, not data in the row.
While Select this option to use a WHILE statement to limit the processing of records in the view based
on a particular criterion or set of criteria. You can enter a condition in the While text box, or click
While to create a WHILE statement using the Expression Builder.
A WHILE statement allows records in the view to be processed only while the specified
condition evaluates to true. As soon as the condition evaluates to false, the processing
terminates, and no further records are considered. You can use the While option in conjunction
with the All, First, or Next options. Record processing stops as soon as one limit is reached.
Note
The number of records specified in the First or Next options references either the
physical or the indexed order of records in a table, and disregards any filtering or quick
sorting applied to the view. However, results of analytical operations respect any
filtering.
If a view is quick sorted, Next behaves like First.
11. If required, select EOF (End of file processing) to force the extract operation to execute one
more time when the end of a table is reached.
The EOF parameter is usually used if you are extracting records as part of a larger analytic
process and the Extract command occurs inside a group in a script. If you are extracting
records based on a comparison between sequential records, you may need to use EOF to
ensure the final record in a table is extracted.
12. Click OK.
13. If the overwrite prompt appears, select the appropriate option.
Exporting data
You can convert and export Analytics data to use in other applications. Destination applications and
formats include:
l Microsoft Excel (.xlsx, .xls)
l Text (.txt)
l Delimited text (.del)
l Comma-separated values (.csv)
l Microsoft Access (.mdb)
l Windows clipboard for pasting into other documents or applications
l XML (.xml)
l JSON (.json)
l dBASE III PLUS (.dbf)
l HighBond (export exceptions to Results)
For more information, see "Exporting exceptions to HighBond Results" on page 208.
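In a script, these exports are performed with the EXPORT command. A minimal sketch with illustrative
field, file, and worksheet names; XLSX and DELIMITED are shown here, and the full list of export type
keywords is in the EXPORT command reference:

COMMENT Export three fields to a worksheet in an Excel file
EXPORT FIELDS Vendor_No Invoice_No Amount XLSX TO "Vendor_invoices" WORKSHEET Invoices
COMMENT Export the same fields to a comma-separated values file
EXPORT FIELDS Vendor_No Invoice_No Amount DELIMITED TO "vendors.csv"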
Exporting to Excel
You can export Analytics tables as individual Excel worksheets to newly created or existing Excel
files. Exporting to an existing Excel file is supported for *.xlsx only.
Analytics tables that exceed these maximums export successfully, but the excess
records are ignored and not exported.
Length of field names:
o a maximum of 64 characters
o for Excel 2.1, a maximum of 248 characters
Steps
You can export some or all of the records or fields in an Analytics table to use in other applications.
Show me how
When you select this option, the fields are exported using the physical field names in the
table layout.
For information about renaming fields, see "Rename a field in a table layout" on page 750.
Tip
To export to a comma-separated values file (*.csv), select
Delimited and make sure to select a comma , in the Column
Separator drop-down list. When specifying the export file name in
the To field, include the .csv file extension. For example: vendors.csv
Note
If you specify a worksheet name, it can contain only alphanumeric
characters or the underscore character ( _ ). The name cannot
contain special characters, spaces, or start with a number.
You can overwrite a worksheet in an existing Excel file, but only if
the worksheet was originally created by exporting from
Analytics to Excel.
You cannot overwrite worksheets that were created directly in
Excel, or any worksheet that has been renamed.
Delete the name in the Add worksheet text box, and leave the text box empty.
When you overwrite an existing Excel file, a worksheet with the same name as
the Analytics table you are exporting from is automatically created in the
resulting Excel file.
XML o Optionally select Export with XML Schema to include the XML Schema in the
exported XML file.
The XML Schema contains metadata that describes the structure of the XML file,
including the data type of the fields. You can validate the file against the Schema
once the file has been exported.
Note
The Unicode option is available only when you export to Clipboard,
Delimited, Text, or XML.
For more information, see "Galvanize Unicode products" on
page 2682.
Note
The If condition is evaluated against only the records remaining in a table after
any scope options have been applied (First, Next, While).
The IF statement considers all records in the view and filters out those that do not meet the
specified condition.
2. Do one of the following:
o In the To text box, specify the name of the file that will contain the exported data.
o Click To and specify the file name, or select an existing file in the Save or Save File As
dialog box.
If Analytics prefills a table name, you can accept the prefilled name, or change it.
Note
If you are exporting data to the clipboard, the To text box is disabled because
you are not saving the data in a file.
l All
l First
l Next
l While
Show me more
All This option is selected by default. Leave it selected to specify that all records in the view are
processed.
First Select this option and enter a number in the text box to start processing at the first record in the
view and include only the specified number of records.
Next Select this option and enter a number in the text box to start processing at the currently selected
record in the view and include only the specified number of records. The actual record number
in the leftmost column must be selected, not data in the row.
While Select this option to use a WHILE statement to limit the processing of records in the view based
on a particular criterion or set of criteria. You can enter a condition in the While text box, or click
While to create a WHILE statement using the Expression Builder.
A WHILE statement allows records in the view to be processed only while the specified
condition evaluates to true. As soon as the condition evaluates to false, the processing
terminates, and no further records are considered. You can use the While option in conjunction
with the All, First, or Next options. Record processing stops as soon as one limit is reached.
Note
The number of records specified in the First or Next options references either the
physical or the indexed order of records in a table, and disregards any filtering or quick
sorting applied to the view. However, results of analytical operations respect any
filtering.
If a view is quick sorted, Next behaves like First.
5. If you are exporting to a delimited file or a text file, optionally select Append To Existing File if
you want to append the exported data to the end of an existing file.
6. Click OK.
7. If the overwrite prompt appears, select the appropriate option.
If you use HighBond, you can export exception data in an Analytics table to a table in Results. To
export exceptions, you use the standard procedure for exporting data from Analytics, with some
minor differences.
Security requirements
The ability to export exception data to a control test requires a specific HighBond role assignment, or
administrative privileges:
l Users with a Professional User or Professional Manager role for a Results collection can export
results to any control test in the collection.
Note
Only users with the Professional Manager role can overwrite existing data in a
control test.
l HighBond account admins and Results admins automatically get a Professional Manager role
in all collections in the HighBond instances they administer.
Password requirement
Password not required
You do not need to specify a password to export to Results if you used online activation to activate
your copy of Analytics. The password is automatically created and sent to Results based on activation
information stored on your computer.
Password required
You do need to specify a password to export to Results if you used offline activation to activate your
copy of Analytics. The required password value is a HighBond access token.
Note
A password is also required if you use a script to export to Results, and you run the
script in Robots, Analytics Exchange, or the Analysis App window.
l Use an existing token – Select the token you want to use and enter your HighBond account password. The unmasked token is displayed.
Tip
Use an existing token unless you have a reason for creating a new one. If the
existing token does not work, create a new one.
Using an existing token cuts down on the number of tokens you need to
manage.
l Create a new token – Click Create token > Analytics and enter your HighBond account
password.
A new Analytics token is created.
Note
If you are a Launchpad System Admin, you also have the option of creating
an API token. You should reserve API tokens for their intended purpose,
which is programmatic access to the HighBond platform.
Tip
Do not close the dialog box containing the token until you have successfully
pasted the token.
Caution
Safeguard your access tokens like any account password. They contain information
unique to your HighBond account. You should not share access tokens.
Export limits
The limits that apply when exporting to a control test in Results are shown below.
Within these limits, you can export multiple times to the same control test. If data already exists in the
control test, you have the option of overwriting it, or appending the new data.
Note
Although you can export up to 100,000 records to a control test, a better approach is
to create smaller, more focused exception sets.
Note
When you append data to questionnaire fields, the display name of the column in
Results remains the name that is specified in the questionnaire configuration, even if
you changed the display name in Analytics.
Overwrite table option not selected – the exported data is appended to the existing Results table:
o matching value – if a matching value exists in the primary key field in Results and the corresponding field exported from Analytics, the record in Results is updated with the values present in the exported record
o no matching value – if a matching value does not exist in the primary key field in Results and the corresponding field exported from Analytics, the record in Results is not updated and the exported record is appended to the table
Overwrite table option selected – the exported data replaces (overwrites) the existing Results table
When you select this option, the fields are exported using the physical field names in the
table layout.
For information about renaming fields, see "Rename a field in a table layout" on page 750.
Tip
You can Ctrl+click to select multiple non-adjacent fields, and Shift+click to
select multiple adjacent fields.
If you want to append the exported data to the existing table in Results, leave Overwrite table
deselected.
Note
Analytics fields can only be appended to existing Results fields if they have
matching physical field names, regardless of their display name in either
application. In Analytics, the physical field name is the name in the table
layout.
The order of fields within the two applications does not affect field name
matching.
Exported fields with physical names that do not match the physical name of a
field in the Results table create new columns in the table.
Include field display name selected – the field name in Results is the field name from Analytics, and the display name in Results is the display name from Analytics.
Include field display name not selected – both the field name and the display name in Results are taken from a single Analytics name: the field name if you are exporting fields, or the display name if you are exporting a view.
Note
Do not select Include field display name if you are appending a view to a
Results table that was initially created by exporting a view from an Analytics
version older than 14.1. Doing so may export columns with field names that are
not the same as the names in Results, which will create new columns in Results
and misalign the data between applications.
Note
The If condition is evaluated against only the records remaining in a table after
any scope options have been applied (First, Next, While).
An If condition limits processing to the records in the view that meet the specified condition; records that do not meet the condition are excluded from the export.
2. Do one of the following:
If you know the ID number of the table you are exporting to:
Enter the number in the To text box.
l Enter the number without any quotation marks – for example, 99
l Enter only the number. Do not enter a file name.
l If you are exporting to a data center other than North America, you must also specify the
data center code. The control test ID number and the data center code must be separated by
the at sign (@) – for example, 99@eu.
The data center code specifies which regional HighBond server you are exporting the data
to:
l ap – Asia Pacific
l au – Australia
l ca – Canada
l eu – Europe
l us – North America
The North America data center is the default, so specifying @us is optional.
If you do not know the ID number of the table you are exporting to, or if you want to create
a new table:
a. Click To, and in the Select Destination Test dialog box navigate to the appropriate analysis
folder.
b. Do one of the following:
l Select an existing table and click OK .
l Enter a name in the New data analytic field and click Create.
You are returned to the Export dialog box and the control test ID number and data center
code are prefilled in the To text box.
3. Click the More tab.
4. Select the appropriate option in the Scope panel:
l All
l First
l Next
l While
Show me more
All – This option is selected by default. Leave it selected to specify that all records in the view are processed.
First – Select this option and enter a number in the text box to start processing at the first record in the view and include only the specified number of records.
Next – Select this option and enter a number in the text box to start processing at the currently selected record in the view and include only the specified number of records. The actual record number in the leftmost column must be selected, not data in the row.
While – Select this option to use a WHILE statement to limit the processing of records in the view based on a particular criterion or set of criteria. You can enter a condition in the While text box, or click While to create a WHILE statement using the Expression Builder.
A WHILE statement allows records in the view to be processed only while the specified condition evaluates to true. As soon as the condition evaluates to false, the processing terminates, and no further records are considered. You can use the While option in conjunction with the All, First, or Next options. Record processing stops as soon as one limit is reached.
Note
The number of records specified in the First or Next options references either the
physical or the indexed order of records in a table, and disregards any filtering or quick
sorting applied to the view. However, results of analytical operations respect any
filtering.
If a view is quick sorted, Next behaves like First.
5. Click OK.
A progress indicator appears while the exception data is exported to Results. When the export
operation is complete, an entry is created in the log.
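Exports to Results can also be scripted. The sketch below is a minimal example using the EXPORT command's ACLGRC output type; the table name, control test ID (99), data center code, and password prompt text are placeholders, and the PASSWORD prompt is only needed in the situations described under "Password requirement" above.
COMMENT Prompt for a HighBond access token (required with offline activation,
COMMENT or when the script runs in Robots, Analytics Exchange, or the Analysis App window).
PASSWORD 1 "Enter your HighBond access token"
OPEN AR_Exceptions
COMMENT Export all fields to control test 99 in the European data center,
COMMENT overwriting any data already in the control test.
EXPORT FIELDS ALL ACLGRC PASSWORD 1 TO "99@eu" OVERWRITE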
were primary or unique keys in a source database. However, Analytics does not contain this
information.
The same situation is true when you directly access database tables using an Analytics database
profile. Analytics retains no information about which fields are key fields in the database, and you may
need to know this information yourself when constructing a database query.
Concatenating fields
If your analysis requires testing or processing two or more fields in a table as a single data element,
you can create a computed field that concatenates (adds together) the fields. You can then test or
process the combined data in the computed field.
For example, you could concatenate first, middle, and last name fields into a single field containing full
names, or concatenate vendor ID and location code fields to produce unique identifiers for each
outlet of every retail chain in a table.
Note
You can concatenate only character fields. If necessary, use Analytics functions to
convert non-character data prior to concatenating.
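In a script, the same concatenation can be set up with a computed field. The sketch below is one way to do it, using hypothetical Vendor_ID and Location_Code character fields, the + string operator, and ALLTRIM() to remove trailing blanks.
COMMENT Define a computed field that joins vendor ID and location code
COMMENT into a single unique identifier for each outlet.
OPEN Vendors
DEFINE FIELD Vendor_Outlet COMPUTED
ALLTRIM(Vendor_ID) + "-" + ALLTRIM(Location_Code)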
7. Click Accept Entry and click Close to exit the Table Layout dialog box.
For information about how to add the computed field to the view, see "Add columns to a view"
on page 777.
You can use Analytics to generate a set of random numbers. You can specify certain parameters,
such as the size of the set, and the range.
Typically, the set of generated values is used for applications outside Analytics, such as drawing a
random selection of hard-copy files.
Note
If you require a random selection to be statistically valid or representative of the entire
population, you need to follow a more formal sampling process. For more information,
see "Sampling data" on page 950.
Note
You should not select Unique when the specified size of the set of random
numbers exceeds 75 percent of the range between Minimum and
Maximum . Doing so can result in too many random number selections being
discarded.
o Sorted – Specifies that the set of random numbers is displayed in ascending order.
By default, the numbers are displayed in the order in which they are randomly selected.
o Append To Existing File – Specifies that the output results should be appended to the end
of an existing file instead of overwriting the existing file.
3. Click the Output tab.
4. Select the appropriate output option in the To panel:
o Screen – displays the set of random numbers in the Results tab in the Analytics display area.
o File – saves the set of random numbers to a text file.
5. If you selected File as the output type, specify the file name in the Name text box in the As
panel, or click Name and browse to select an existing file.
If the Append To Existing File checkbox is selected the output is appended to a file with the
same name, if found, otherwise you are prompted to either overwrite the file or append the
output.
You can also specify an absolute or relative file path, or navigate to a different folder, to save or
append the output to a file in a location other than the project location. For example:
C:\Output\random.txt or Output\random.txt.
6. Click OK.
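The equivalent scripted version uses the RANDOM command. The sketch below is a minimal example; the parameter values are placeholders.
COMMENT Generate 25 unique random numbers between 1 and 10000, display them
COMMENT in 5 columns, sort them in ascending order, and save them to a text file.
RANDOM NUMBER 25 SEED 45 MINIMUM 1 MAXIMUM 10000 COLUMNS 5 UNIQUE SORTED TO "Random_Numbers.txt"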
Steps
1. Open the table from which you want to randomly select records.
2. From the main menu, select Sampling > Record/Monetary Unit Sampling > Sample.
3. Under Sample Type, select Record.
4. Under Sample Parameters, select Random.
5. Specify the following values:
l Size – the number of records that you want to randomly select
l Seed – (optional) a seed value to initialize the Analytics random number generator
The seed value can be any number. You can recreate the same random selection of records
by respecifying the same seed value.
Enter a seed value of ‘0’, or leave the seed blank, if you want Analytics to randomly select a
seed value.
l Population – the total number of records in the table
l To – the name of the output table
6. Click OK.
1. Navigate or connect – Navigate to a source data file, or connect to a file or a database containing the source data.
2. Define – Define the source data, which means: specify information about the structure and characteristics of the source data so that Analytics can read it.
Note
Analytics automatically defines certain source data formats so that user definition of the data is not required.
3. Import or read directly – Import the source data into a native Analytics data file, or read the data directly from the source without creating an Analytics data file.
4. Name and save the Analytics table – Name and save the automatically created Analytics table.
Note
When connecting to any data source, or importing from any data source, Analytics is
strictly read-only. For more information, see "Data access by Analytics is read-only"
on page 228.
Import multiple tables – Data Definition Wizard: No (Yes for Excel); Data Access window: Yes (up to 5)
Preview data import – Data Definition Wizard: Yes (basic); Data Access window: Yes (modern interface, easily refreshable)
Note
The maximum record length supported by an Analytics data file (.fil) is 32 KB. Any
record that exceeds 32 KB causes the import process to fail.
Note
Certain requirements or prerequisites exist when using the Data Access window to
connect to a database or a cloud data service. For more information, see "Before you
connect to a database or a cloud data service" on page 357.
HighBond
External Definition
Note
You can use the Data Access window to access any ODBC-compliant data source,
not just the native data connectors listed below. For more information, see "Using the
Data Access window to access any ODBC data source" on page 227.
l Active Directory
l Amazon Athena
l Amazon DynamoDB
l Amazon Redshift
l Amazon S3
l Apache Cassandra
l Apache Drill
l Apache HBase
l Apache Hive
l Apache Spark
l AWS Data Management
l Azure Data Management
l Azure Table Storage
l Box
l Cloudera Impala
l Concur
l Couchbase
l DocuSign
l Dynamics CRM
l Dynamics GP
l Dynamics NAV
l Dynamics 365 Business Central
l Dynamics 365 Finance and Operations
l Dynamics 365 Sales
l Edgar Online
l Email
l Epicor ERP
l Exact Online
l Exchange
l Google BigQuery
l Jira
l JSON
l LDAP
l LinkedIn
l Marketo
l Microsoft SQL Server
l MongoDB
l MySQL
l NetSuite
l OData
l Open Exchange Rates
l Oracle
l Oracle Eloqua
l Oracle Sales Cloud
l Presto
l Qualys
l QuickBooks
l QuickBooks Online
l QuickBooks POS
l REST Services
l Rsam
l RSS/ATOM
l Sage 50 UK
l Sage Cloud Accounting
l Sage Intacct
l Salesforce
l SAP
(requires a separate subscription entitlement)
l SAP ByDesign
l SAP Hybris Cloud for Customer
l SAP SuccessFactors
l ServiceNow
l SFTP
l SharePoint
l Slack
l Snowflake
l Splunk
l Square
l Stripe
l SugarCRM
l SurveyMonkey
l Sybase
l Sybase IQ
l Tenable SecurityCenter
l Teradata
l Twitter
l UPS
l USPS
l xBase
l Zendesk
Note
You can also import data using the Data Access window. For more information, see
"Importing data using the Data Access window " on page 353.
When connecting to any data source, or importing from any data source, Analytics is
strictly read-only. For more information, see "Data access by Analytics is read-only"
on page 228.
Defining data
You may be required to define the data as you import it, which means to specify metadata such as:
l field names
l field lengths
l field data types
l format of numeric and datetime values
The image below shows the definition of the DATE field in an Excel worksheet being imported using
the Data Definition Wizard.
Import Microsoft Excel data to Analytics for analysis using a variety of different tools.
How it works
You use the Data Definition Wizard to select one or more Excel files, specify one or more worksheets
to import, and import the Excel data to Analytics. The imported data creates one or more new
Analytics tables and associated data files (.fil). Each imported worksheet creates a separate
Analytics table.
The Analytics data file contains a copy of the Excel data that is completely separate from the original
Excel file.
You can import data from an Excel file even if you do not have Microsoft Excel installed on your
computer.
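The import can also be scripted with the IMPORT EXCEL command. The sketch below is an example under stated assumptions: the file, worksheet, and table names are placeholders, and the ALLFIELDS keyword (import every field with automatically detected data types) should be confirmed against the IMPORT EXCEL command reference for your version of Analytics.
COMMENT Import the worksheet Sheet1 from an Excel file to a new Analytics table,
COMMENT using the first row of the worksheet as field names.
COMMENT ALLFIELDS imports every field; verify this keyword in your version.
IMPORT EXCEL TO Payroll_May "Payroll_May.fil" FROM "Payroll_2024.xlsx" TABLE "Sheet1$" KEEPTITLE ALLFIELDS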
Tip
If you import multiple worksheets into multiple tables and intend to combine them into a single table, combine the tables first, and then make any required adjustments to the data definition in the combined table rather than in each individual table.
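If you script the combination, one option in recent versions of Analytics is the APPEND command; the table names below are placeholders.
COMMENT Combine three imported tables into a single new table and open it.
APPEND Trans_Jan Trans_Feb Trans_Mar TO Trans_Q1 OPEN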
Guidelines
Review the guidelines below to assist you with importing Excel data.
Note
When the new table opens in Analytics, a maximum of 256 columns are displayed in
the default view. If the table contains additional columns, you can manually add them
to the view, if required.
Excel 97 – 2003
The import of .xls files (Excel 97 - 2003) uses an older type of processing, and is subject to the
following maximums:
l 255 columns
l 255 characters per field
l 32 KB per record
l 65,000 rows
The two parts of the table are overwritten independently of each other. If both parts have the same
name as the new table, both are overwritten. This scenario is the most common.
But if the table layout and the source data file have different names, only the one with the same name
as the new table is overwritten.
This overwrite behavior applies whether you are importing a single Excel worksheet, or multiple
worksheets.
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
"Overwrite existing o new table layout and source data file o new table layout and source data file
tables" not selec- created, with numeric suffix created, with numeric suffix
ted
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
o new table layout created o new table layout and source data file
o existing source data file overwritten created
"Overwrite existing o new and existing table layouts both o existing table layout and source data
tables" selected linked to source data file file preserved
o new table layout and source data file o new table layout and source data file
created, with numeric suffix created
o existing table layout and source data
For example:
file preserved
l table layout –Table_A2
l source data file –Table_A2.fil
"Overwrite existing
tables" not selec- o existing table layout and source data
ted file preserved
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
o existing table layout overwritten, linked o existing table layout overwritten, linked
to new source data file to new source data file
o new source data file created o new source data file created
"Overwrite existing o existing source data file preserved, o existing source data file preserved,
tables" selected unlinked unlinked
o new table layout and source data file o new table layout and source data file
created, with numeric suffix created, with numeric suffix
For example: For example:
l table layout –Table_A2 l table layout –Table_A2
l source data file –Table_A2.fil l source data file –Table_A2.fil
"Overwrite existing
tables" not selec- o existing table layout and source data o existing table layout and source data
ted file preserved file preserved
Note
Make sure the Excel file is closed before you begin the import process.
Note
To see any named ranges, deselect System Table Only .
Worksheets are identified with a dollar sign ($) appended to the worksheet name. The dollar
sign is added temporarily, and does not appear in the Analytics table name.
2. Review the default settings on the page, make any required updates, and click Next.
Setting Description
Use first row as Field Names – Values in the first row in the worksheet or the named range are used as field names in the Analytics table.
Note
If you use this setting, the row used as field names is whatever line number is specified in Start On Line.
Start On Line – The line on which to start reading the worksheet or the named range.
Note
The start line for a named range is always the first line in the named range, regardless of the Start On Line setting.
Import all fields as character type – Assigns the Character data type to all the imported fields.
Tip
Assigning the Character data type to all the imported fields simplifies the process of importing Excel files. Once the data is in Analytics, you can assign different data types, such as Numeric or Datetime, to the fields, and specify format details.
Import all fields as character type is useful if you are importing a table with identifier fields automatically assigned the Numeric data type by Analytics when in fact they should use the Character data type.
First 100 records – Analytics uses only the first 100 records in the worksheet or the named range to determine the data type of fields, and the length of fields, in the Analytics table. With large Excel files, using First 100 records significantly speeds up the import process.
Caution
Select this option only if you are confident that values in the first 100 records accurately reflect the data type and length of all subsequent values. If any values after the first 100 records are of a different data type, or are longer, the resulting Analytics table will contain inaccurate or truncated data. Inaccurate or truncated data in critical fields will very likely invalidate the results of subsequent data analysis.
All records – Analytics uses all the records in the worksheet or the named range to determine the data type of fields, and the length of fields, in the Analytics table. With large Excel files, using all the records to determine data type and field length significantly slows down the import process.
Property Description
Column Title The column title for the field in the default Analytics view.
If you do not specify a column title, the Name value is used.
Length The length of the field in the table layout. Specify the length in characters.
If a datetime field has no time data and displays 00:00:00 after the date, you can shorten
the length of the field to omit the empty time data.
Note
Maximum field length is 32,767 characters (non-Unicode edition) or
16,383 characters (Unicode edition). The entire field length, up to the
maximum, is imported into Analytics, but only the first 256 characters are
displayed in the table view. The remainder of the data is present, and can
be analyzed, but it is not visible in the view. To see all the data, open the
Table Layout dialog box.
Tip
Increase the length of a field if you selected First 100 records in the
previous screen, but you are uncertain about the length of subsequent
values in the field.
Note
If you selected Import all fields as character type in the Data Source page, the options below
do not apply and are disabled.
Value A read-only property that displays the first value in the field.
The value dynamically updates based on any edits you make.
Decimal Numeric fields only. The number of decimal places in the source data.
Note
The Decimal text box appears automatically when you select a Numeric
data type.
Input Format Datetime fields only. The format of datetime values in the source data.
Select a format that matches the data, or if necessary create a format to match the data.
The format you specify must exactly match the format in the source data.
For more information about date and time formats, see "Formats of date and time source
data" on page 347.
Note
The Input Format text box appears automatically when you select a
Datetime data type.
Note
Make sure all Excel files are closed before you begin the import process.
All first rows in the worksheets and named ranges that you import should use a
consistent approach. First rows should be either field names, or data, across all data
sets. Avoid mixing the two approaches in a single import operation.
If the data sets have an inconsistent approach to first rows, use two separate import
operations.
Note
To see any named ranges, deselect System Table Only .
Select individual worksheets or named ranges, or select the first checkbox if you want to select
all the worksheets and named ranges in the Excel file or files.
Worksheets are identified with a dollar sign ($) appended to the worksheet name. The dollar
sign is added temporarily, and does not appear in the resulting Analytics table name.
2. Review the settings assigned by Analytics, make any required updates, and click Next.
Setting Description
Table Name – The name of the resulting Analytics table.
Note
The table name applies to both the new table layout and the new source data file created when importing the data.
Use first row as Field Names – Values in the first row in each worksheet or named range are used as field names in the resulting table layouts.
Note
If you use this setting, the row used as field names is whatever line number is specified in Start On Line.
This setting applies globally to all worksheets and named ranges that you import.
Start On Line – The line on which to start reading each worksheet or named range.
Note
The start line for a named range is always the first line in the named range, regardless of the Start On Line setting.
Include File Name – Prepend the Excel file name to the name of the Analytics table or tables.
Tip
If worksheets in different files have the same name, prepending the Excel file name allows you to avoid table name conflicts.
Import all fields as character type – Assigns the Character data type to all the imported fields.
Tip
Assigning the Character data type to all the imported fields simplifies the process of importing Excel files. Once the data is in Analytics, you can assign different data types, such as Numeric or Datetime, to the fields, and specify format details.
Import all fields as character type is useful if you are importing a table with identifier fields automatically assigned the Numeric data type by Analytics when in fact they should use the Character data type.
First 100 records – Analytics uses only the first 100 records in the worksheet or named range to determine the data type of fields, and the length of fields, in the resulting Analytics tables. With large Excel files, using First 100 records significantly speeds up the import process.
Caution
Use this option only if you are confident that values in the first 100 records accurately reflect the data type and length of all subsequent values. If any values after the first 100 records are of a different data type, or are longer, the resulting Analytics table will contain inaccurate or truncated data. Inaccurate or truncated data in critical fields will very likely invalidate the results of subsequent data analysis.
All records – Analytics uses all the records in the worksheet or named range to determine the data type of fields, and the length of fields, in the resulting Analytics tables. With large Excel files, using all the records to determine data type and field length significantly slows down the import process.
Output Path – Specifies the folder where the new Analytics data files (.fil) are saved. If you leave Output Path blank, the Analytics data files are saved in the folder containing the Analytics project.
Note
If a numeric suffix has been added to a Table Name in the Final page, a table with the
same name already exists in the Analytics project and you have chosen not to
overwrite existing tables.
For detailed information, see "How overwriting works" on page 236.
You can create an Analytics table by defining and importing a Microsoft Access database file.
The Access file can be any version from Access 2000 to Access 2010. To import a file from an earlier
version of Access you can save the file in another file format that Analytics can define and import.
You can import an Access file even if you do not have Microsoft Access installed on your computer.
1. Select File > New > Table.
2. If the Select Platform for Data Source page is displayed, select Local and click Next.
3. Select File and click Next.
4. In Select File to Define, locate and select the file you want to create the Analytics table from
and click Open. Microsoft Access database files have a .mdb or .accdb file extension.
5. In the File Format page, verify that the Access database option has been selected and click
Next.
6. Complete the following steps in the Data Source page:
a. Select the table or view to import. The available options are listed in the Select the Access
Table/View list.
b. If you want to increase or decrease the maximum number of characters imported from text
fields, enter a new value in the Maximum Character Field Length text box.
You can enter from 1 to 255 characters.
c. If you want to increase or decrease the maximum number of characters imported from
memo or long text fields, enter a new value in the Maximum Memo Field Length text box.
You can enter from 1 to 32767 characters (non-Unicode Analytics), or 16383 characters
(Unicode Analytics).
d. Click Next.
7. In the Save Data File As dialog box, modify the filename and location for the Analytics data
file, as necessary, and click Save.
8. In the Final page, verify the settings for the new Analytics table and click Finish.
9. Enter a name for the Analytics table you are adding to your project and click OK.
Import a delimited text file to Analytics for analysis using a variety of different tools.
How it works
You use the Data Definition Wizard to select one or more delimited text files and import the data to
Analytics. The imported data creates one or more new Analytics tables and associated data files (.fil).
Each imported delimited text file creates a separate Analytics table.
The Analytics data file contains a copy of the delimited data that is completely separate from the
original delimited text file.
You can import delimited text files located on your local computer or on a network drive. Users of
Analytics Exchange can also access delimited text files located on an Analytics Server.
Tip
If you import multiple delimited text files into multiple tables and intend to combine them into a single table, combine the tables first, and then make any required adjustments to the data definition in the combined table rather than in each individual table.
First_Name,Last_Name,CardNum,EmpNo,HireDate,Salary,Bonus_2011
Lila,Remlawi,8590122497663807,000008,12/28/2007,52750,"$1,405.40"
Vladimir,Alexov,8590122281964011,000060,10/5/2007,41250,"$4,557.43"
Alex,Williams,8590124253621744,000104,8/12/2010,40175,"$7,460.02"
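A scripted import of a file like the sample above could use the IMPORT DELIMITED command. Treat the sketch below as an assumption to verify against the command reference for your version: the encoding value (0), separator, qualifier, and ALLFIELDS keyword follow the command's documented pattern, and the file and table names are placeholders.
COMMENT Import a comma-delimited file, using the first line as field names.
COMMENT 0 = ASCII/ANSI encoding; ALLFIELDS imports every field.
IMPORT DELIMITED TO Employee_Comp "Employee_Comp.fil" FROM "Employee_Comp.csv" 0 SEPARATOR "," QUALIFIER '"' STARTLINE 1 KEEPTITLE ALLFIELDS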
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
"Overwrite existing o error message "Existing file or table o error message "Existing file or table
tables" not selec- names detected" appears names detected" appears
ted
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
o new table layout created o new table layout and source data file
o existing source data file overwritten created
"Overwrite existing o new and existing table layouts both o existing table layout and source data
tables" selected linked to source data file file preserved
o error message "Existing file or table o new table layout and source data file
"Overwrite existing names detected" appears created
tables" not selec- o existing table layout and source data
ted file preserved
Same "Output Path" as existing source Different "Output Path" from existing
data file source data file
o existing table layout overwritten, linked o existing table layout overwritten, linked
to new source data file to new source data file
o new source data file created o new source data file created
"Overwrite existing o existing source data file preserved, o existing source data file preserved,
tables" selected unlinked unlinked
"Overwrite existing o error message "Existing file or table o error message "Existing file or table
tables" not selec- names detected" appears names detected" appears
ted
Start on Line The line number on which to start reading the file.
This setting allows you to skip lines at the beginning of a file that contain
information you do not want to import. For example, if the first three lines of a file
contain header information, enter 4 to start reading data on the fourth line.
Field Width For the selected column heading in the preview table, specifies the length of the
field in the resulting table layout. Specify the length in characters.
You can keep the length assigned by Analytics, or enter a different length.
Note
Maximum field length is 32,767 characters (non-Unicode edition) or
16,383 characters (Unicode edition). The entire field length, up to the
maximum, is imported into Analytics, but only the first 256 characters
are displayed in the table view. The remainder of the data is present,
and can be analyzed, but it is not visible in the view. To see all the
data, open the Table Layout dialog box.
Tip
If you intended to periodically refresh the resulting Analytics table
from updated source data, or re-use the import command, enter a
longer field length than the one assigned by Analytics.
A longer field length provides extra space if updated values in the
source data are longer than any of the current values. Values that
exceed the available field length are truncated.
Use first row as field names – Values in the first line in the file are used as field names in the resulting table layout.
Note
If you use this setting, the row used as field names is whatever line
number is specified in Start on Line. If the field names are not
correct, you can update them in a subsequent page in the Data
Definition Wizard.
Field Separator – The character that separates the fields in the file:
o Comma
o TAB
o Semicolon
o Other – allows you to specify the character that is used as the field separator
Text Qualifier The text symbol that identifies values contained in fields:
o Double Quote
o Single Quote
o None – indicates that no text qualifier is used
o Other – allows you to specify the character that is used as the text qualifier
Clear CR and Clear LF – Cleanses the imported data of misplaced carriage return (CR) and/or line feed (LF) characters.
Misplaced CR/LF characters can cause incorrect line breaks within records. When
enabled, the option replaces any CR/LF characters with a space. Only CR/LF
characters that occur inside a pair of text qualifiers are replaced.
For Windows files, select both Clear CR and Clear LF.
The two options are disabled if Text Qualifier is None.
All Character Assigns the Character data type to all the imported fields.
Tip
Assigning the Character data type to all the imported fields simplifies
the process of importing delimited text files.
Once the data is in Analytics, you can assign different data types,
such as Numeric or Datetime, to the fields, and specify format details.
The All Character option is useful if you are importing a table with
identifier fields automatically assigned the Numeric data type by
Analytics when in fact they should use the Character data type.
Note
Select a column heading in the preview table to see the properties associated with the
column.
Property Description
Ignore this field Excludes the field from the resulting table layout.
The data in the field is still imported, but it is undefined, and does not appear in the new
Analytics table. It can be defined later, if necessary, and added to the table.
Column Title The column title for the field in the default Analytics view.
If you do not specify a column title, the Name value is used.
Note
If you selected All Character in the Delimited File Properties page, the options below do not
apply and are disabled.
Type The data type assigned to the field in the resulting Analytics table.
You can keep the data type assigned by Analytics, or select an appropriate data type from
the drop-down list.
For information about the supported data types in Analytics, see "Data types in Analytics"
on page 740.
Value A read-only property that displays the first value in the field.
The value dynamically updates based on any edits you make.
Decimal Numeric fields only. The number of decimal places in the source data.
Note
The Decimal text box appears automatically when you select a Numeric
data type.
Input Format Datetime fields only. The format of datetime values in the source data.
The format you specify must exactly match the format in the source data.
For more information about date and time formats, see "Formats of date and time source
data" on page 347.
2. Enter a name for the table layout that you are adding to the project, or keep the default name,
and click OK.
The new Analytics table is created with data from the imported file.
Note
All first rows in the files that you import should use a consistent approach. First rows
should be either field names, or data, across all files. Avoid mixing the two
approaches in a single import operation.
If the files have an inconsistent approach to first rows, use two separate import
operations.
Setting Description
Table Name – The name of the resulting Analytics table.
Note
The table name applies to both the new table layout and the new source data file created when importing the data.
Output Path – Specifies the folder where the new Analytics data files (.fil) are saved. If you leave Output Path blank, the Analytics data files are saved in the folder containing the Analytics project.
3. If the error message "Existing file or table names detected" appears, click OK and do one or
both of the following:
l Select Overwrite existing tables if any existing table layouts or associated data files with
1. In the Delimited File Properties page, review the settings assigned by Analytics to the
properties listed below, make any required updates, and click Next.
Property Description
Start on Line The line number on which to start reading the files.
This setting allows you to skip lines at the beginning of files that contain information
you do not want to import. For example, if the first three lines of each file contain
header information, enter 4 to start reading data on the fourth line.
Field Width For the selected column heading in the preview table, specifies the length of the
field in the resulting table layout. Specify the length in characters.
You can keep the length assigned by Analytics, or enter a different length.
Note
Maximum field length is 32,767 characters (non-Unicode edition) or
16,383 characters (Unicode edition). The entire field length, up to the
maximum, is imported into Analytics, but only the first 256 characters
are displayed in the table view. The remainder of the data is present,
and can be analyzed, but it is not visible in the view. To see all the
data, open the Table Layout dialog box.
Tip
If you intended to periodically refresh a resulting Analytics table from
updated source data, or re-use the import command, enter a longer
field length than the one assigned by Analytics.
A longer field length provides extra space if updated values in the
source data are longer than any of the current values. Values that
exceed the available field length are truncated.
Use first row as field names – Values in the first line in each file are used as field names in the resulting table layouts.
Note
If you use this setting, the row used as field names is whatever line
number is specified in Start on Line.
This setting applies globally to all files that you import.
Text Qualifier The text symbol that identifies values contained in fields:
o Double Quote
o Single Quote
o None – indicates that no text qualifier is used
o Other – allows you to specify the character that is used as the text qualifier
Clear CR and Clear LF – Cleanses the imported data of misplaced carriage return (CR) and/or line feed (LF) characters.
Misplaced CR/LF characters can cause incorrect line breaks within records. When
enabled, the option replaces any CR/LF characters with a space. Only CR/LF
characters that occur inside a pair of text qualifiers are replaced.
For Windows files, select both Clear CR and Clear LF.
The two options are disabled if Text Qualifier is None.
All Character Assigns the Character data type to all the imported fields.
Tip
Assigning the Character data type to all the imported fields simplifies
the process of importing delimited text files.
Once the data is in Analytics, you can assign different data types,
such as Numeric or Datetime, to the fields, and specify format details.
The All Character option is useful if you are importing a table with
identifier fields automatically assigned the Numeric data type by
Analytics when in fact they should use the Character data type.
Caution
Use control totals to verify that the Analytics table created from an imported print
image or PDF file contains all the data from the source file. Unintentionally excluding
records is a possibility when defining print image or PDF files. You should always
verify that you have a complete data set in Analytics before beginning any analysis.
General points
The file definition process is iterative
Successfully defining a print image or PDF file is typically an iterative process and may require a
certain amount of trial and error. You will need to perform some or all of the following individual tasks:
l define one or more fields
l define a set of detail records based on a unique value
l define one or more header or footer records
l modify or further build criteria to fine tune a captured set of records
l review each field and record definition for accuracy
l edit inaccurate field or record definitions
l make multiple passes through a file as one way of dealing with misaligned data
Misaligned data
Workarounds for misaligned data
In the Data Definition Wizard, misaligned data columns in a parsed PDF or print image file (see
"Aligned and misaligned data in a parsed PDF file" below) can make it difficult or labor-intensive to
create an Analytics table that is usable. If misaligned data is a significant problem, consider any of the
following approaches.
Note
The most suitable approach for your situation depends on the nature of the data you
are trying to define, and your experience with Analytics. New users of Analytics
should consider asking for the data in a different format.
l Go back to the source of the file and ask for the data in a different format.
l Try converting the file using conversion software — for example, software that converts a PDF
file to an Excel file or a text file. Import the converted file to Analytics.
l Try copying and pasting PDF data into a text editor. Then import the text file to Analytics.
l Use one or more of the following techniques to define misaligned fields:
l Create a field definition that is long enough to capture the leftmost and the rightmost positions of the values in a misaligned field.
l Create a single, long field definition that encompasses multiple misaligned fields.
For more information, see "Defining misaligned fields in a print image or PDF file" on page 308.
l Import the source file more than once. With each import, define a different subset of records.
Append the resulting Analytics tables to assemble a complete data set.
For more information, see "Defining and importing subsets of print image or PDF data" on
page 312.
Note
Only aqua-blue highlighted fields become part of the resulting Analytics table.
Gray highlighted data in a defined record is ignored unless it is also defined as a field.
The gray portions of a record between defined fields are omitted in the resulting
Analytics table.
Completely undefined data is ignored. If you want to include any of this data in the
resulting Analytics table, you must define additional fields and records.
The location references below refer to "The different kinds of data in a PDF file" on the next page.
Detail data – The basic content of a file, arranged in records. Defining detail data is mandatory. You cannot define a print image or PDF file without defining detail data.
Example: credit card transactions, inventory records
Location: #2, outlined in blue
Header data – The identifying information that appears above blocks or subsets of detail records. Defining header data is optional. If you do not need the header information, you do not need to define it.
Example: store number and location where the credit card transactions took place, “Product Class” information
Location: #1, outlined in red
Footer data – The information that appears below blocks or subsets of detail records. Defining footer data is optional. If you do not need the footer information, you do not need to define it.
Example: subtotaled credit card transactions by store, “Class Totals”
Location: #3, outlined in aqua-blue
Additional guidelines
l You can define detail, header, or footer data in any order you want. A sequence is not
enforced.
l You can also specify field names (outlined in green in "The different kinds of data in a PDF file"
below). The method for specifying field names differs from the process for defining detail,
header, or footer data.
Note
Do not use Header data to attempt to define field names that may appear in a
print image or PDF file.
l In the initial data value you select to begin defining the initial data field
l In the same row as the initial data value
If required, you can change the exact match to a generic match, such as Numeric or Non-Blank,
which can provide greater flexibility when specifying a unique value. For more information, see
"Working with record definitions" on page 304.
When you select or specify a value, in a specific position, to capture a set of records, Analytics
considers any character or characters in that position, from the top of the file to the bottom, as it
searches for the value. Characters are considered even if they are outside those rows that you
consider record data. If the value you specified is not sufficiently precise, additional, non-record data
can be captured and included in the set of records.
Note
You can use criteria to either include or exclude rows in the source file.
Tip
If you use an initial detail field to capture detail records, but you do not want the field to
appear first in the resulting Analytics table, you can use the field to capture records,
and then delete it and re-add it.
Header and footer fields appear in the resulting Analytics table in the order in which you define them.
They appear before detail fields if you have not defined an initial detail field, and they appear after
detail fields once you have defined an initial detail field.
You also have the option of reordering fields once you have finished importing the print image or PDF
file into Analytics. You can drag columns to reorder them in a view. You can also extract by view if you
want to create a new table in which the fields in the table layout are physically reordered. For more
information, see "Extracting data" on page 194. You may find reordering fields in Analytics is easier
than trying to maintain a precise field order in the Data Definition Wizard.
Note
Only detail records are auto-defined. Header or footer data, if you require it, must be
manually defined.
3. Type a name for the field, update the data type if required, and click OK.
The value you selected is outlined with a box.
4. In the value you selected, or in the same row in the file, select one or more characters that are
unique to the set of records you want to capture.
For more information, see "Defining and importing print image (report) files and PDF files" on
page 261.
In the example below, the decimal point in the “Unit Cost” field is selected.
The Record Definition dialog box opens, and the initial field and the associated set of records
are defined.
The field is aqua-blue, and the records are gray. Undefined data continues to have a white
background.
5. If required, change the Record Type, or modify or further build the criteria used to capture the
set of records, and click OK.
7. When you have finished defining the fields you require, click Next.
The remainder of the defining and importing process is similar to the process for defining and
importing other data formats such as Excel and delimited text files.
For complete instructions, see "Define and import a print image file" on the facing page, or
"Define and import a PDF file" on page 291.
You can create an Analytics table by defining and importing a print image file.
When you use the Data Definition Wizard to process a print image file, Analytics may fully or partially
auto-define the file, or you may need to manually define the file.
Note
Defining print image files can be challenging. If you encounter problems, review
"Defining and importing print image (report) files and PDF files" on page 261.
Highlighting Meaning
White background – Undefined data. Analytics was not able to detect a pattern in the data and could not auto-define it.
Result of auto-definition Action to take
If Analytics auto-defined the file and you want to make updates – If Analytics auto-defined the entire file perfectly, and you want to:
o update the generic field names (“Field_1”, “Field_2”, and so on), go to "Edit the auto-definition" below
o add header or footer data to the detail data, go to "Manually define the print image file" on the facing page
Tip
You can also update the generic field names in a subsequent page in the Data Definition Wizard, which you may find more convenient.
If the auto-definition contains errors – If the auto-definition:
o contains errors
o excludes data that you need
o includes data that you do not need
Tip
If the auto-definition contains significant errors, deleting the entire auto-definition and manually defining the file can be easier.
Edit a field definition – Right-click an aqua-blue field and select Edit Field, or double-click the field.
You can make a number of changes, including:
o updating the field name
o updating the data type
o under Advanced Options:
l changing the field length (Field Width)
l changing the starting position of the field
For detailed information, see "Working with field definitions" on page 299.
Edit a record definition – Right-click a gray record and select Edit Record, or double-click the record.
You can make two main changes:
o update the categorization of the record – detail, header, and footer are the options
o modify the criteria that Analytics used to capture the set of records
For detailed information, see "Working with record definitions" on page 304.
Delete a field definition or a record definition –
Note
You are deleting the field definition or the record definition only, not the actual data. If necessary, you can redefine the same field or record data.
Tip
If you want to selectively delete records, select Edit Record and fine-tune the criteria that Analytics used to capture the set of records.
For detailed information, see "Working with record definitions" on page 304.
Note
You can also define a print image file using saved field and record definitions, if they
exist.
For more information, see "Define the print image file using a set of saved field and
record definitions" on the facing page.
1. In the Print Image File Definition page, select a data value to start defining one of the fields in
the table.
For example, you could select a social security number in an SSN field. When you select the
data value, the Field Definition dialog box opens.
Guidelines:
l You can select a value anywhere in the data. You do not have to use the first field in the
table, or select the first value in a field.
l The value you select can be detail data, header data, or footer data.
l Do not select field names. Leave all field names in the source file unselected. If you select
field names in the source file, Analytics treats them as data contained in fields.
l If field values vary in length, select the longest value, or select extra blank spaces to allow for
longer values that may be lower in the field and not currently displayed.
If you intend to use the initial data value you selected to uniquely identify a set of records, see
"Working with field definitions" on page 299.
2. Enter a name for the field, if necessary update the data type, and click OK.
3. In the data value you just selected, or in the same row in the file, select the character, or string
of characters, that uniquely identifies the set of records in the source file.
For example, select:
l a slash in a date value
l a decimal point in a numeric value
l a unique identifying value anywhere in the row containing the data value you selected
When you select the unique character or characters, the Record Definition dialog box opens,
and all records containing the character or characters are highlighted gray.
For detailed information, see "Defining and importing print image (report) files and PDF files"
on page 261.
If you need to define a record that extends beyond one row in the source file, see "Working with
multiline records and fields" on page 314.
4. If required, update the Record Type to match the type of data you are defining: detail, header,
or footer.
5. If required, modify the criteria used to capture the set of records.
For example, you could add additional criteria to omit some of the records that were initially
captured.
For detailed information, see "Working with record definitions" on page 304.
6. Click OK.
The field you defined is highlighted aqua-blue, and the associated set of captured records is
highlighted gray.
7. Scroll vertically to examine the defined field, and the associated set of captured records.
8. If the field is not defined correctly, or if the set of captured records needs adjustment, double-
click the field or the record, and make the necessary edits in the Field Definition dialog box, or
the Record Definition dialog box.
For more information, see "Working with field definitions" on page 299, or "Working with record
definitions" on page 304.
9. Define the remaining fields in the record by selecting a representative data value for each field.
Additional fields automatically conform to the set of records.
Guidelines:
l Define only those fields you want in the resulting Analytics table.
l With each field definition, scroll vertically to examine the defined field. Edit the definitions as
required.
For example, if data values are not fully contained by a field, you need to adjust the length or
starting position of the field, or both.
For more information, see "Edit the auto-definition" on page 284.
l If you need to define field values that extend beyond one row in the source file, see "Working
with multiline records and fields" on page 314.
Tip
The order in which you define detail fields is the order in which they appear in
the resulting Analytics table.
If you delete a detail field during the definition process, and then re-add it, it
loses its original position and is placed last among detail fields.
Define the print image file using a set of saved field and
record definitions
You can define a print image file using field and record definitions from a previous file definition
session that have been saved in a print image query file. The print image query file must already
exist, and the saved definitions must match the current data.
Note
Loading a print image query file deletes any current field and record definitions.
Note
Only load a file with definitions that you know match, or closely match, the
current data.
Note
Field and record definitions often represent a lot of work, and it is recommended
that you save them.
If you subsequently discover that the imported data needs an adjustment, and
must be redefined and reimported, saved definitions do not have to be
recreated from scratch.
2. When you are satisfied with all field and record definitions, click Next.
Note
If required, you can return to this point in the process and make updates to the
field and record definitions.
You can also navigate to a different folder to save the data file if you do not want to use the default
location opened by Analytics.
Note
Select a column heading in the preview table to see the properties associated with the
column.
Property Description
Ignore this field Excludes the field from the resulting table layout.
The data in the field is still imported, but it is undefined, and does not appear in the new
Analytics table. It can be defined later, if necessary, and added to the table.
Column Title The column title for the field in the default Analytics view.
If you do not specify a column title, the Name value is used.
Value A read-only property that displays the first value in the field.
The value dynamically updates based on any edits you make.
Decimal Numeric fields only. The number of decimal places in the source data.
Note
The Decimal text box appears automatically when you select a Numeric
data type.
Input Format Datetime fields only. The format of datetime values in the source data.
The format you specify must exactly match the format in the source data.
For more information about date and time formats, see "Formats of date and time source
data" on page 347.
You can create an Analytics table by defining and importing an Adobe PDF file.
When you use the Data Definition Wizard to process a PDF file, Analytics may fully or partially auto-
define the file, or you may need to manually define the file.
Note
Defining PDF files can be challenging. If you encounter problems, review "Defining
and importing print image (report) files and PDF files" on page 261.
Tip
In some circumstances, parsing a PDF file on a page-by-page basis can help
with data misalignment.
If you take this approach, you need to import the file more than once, create
more than one Analytics table, and then append the resulting tables in
Analytics.
For more information, see "Defining and importing subsets of print image or
PDF data" on page 312.
3. Leave the PDF Parser at the default setting of Xpdf, or select VeryPDF.
If you are importing the file for the first time, and you have no reason to do otherwise, leave the
setting at Xpdf.
If you have already encountered data alignment issues when using Xpdf with the file, select
VeryPDF to see if the parsing results are better.
4. Click Next.
The PDF file is parsed and the PDF File Definition page updates to display the parsed file.
5. Scroll vertically and horizontally to examine the parsed file.
Highlighting indicates whether Analytics has auto-defined data in the file:
Highlighting Meaning
White background Undefined data. Analytics was not able to detect a pattern in the data and could not auto-define it.
6. Optional. If the data in the parsed file is misaligned, click Back, switch the parser selection in
PDF Parser, and click Next.
The PDF file is re-parsed using the parser you selected, which may produce better data
alignment.
Any existing field and record definitions are deleted when you re-parse the file.
7. Do one of the following:
Result of auto-definition Action to take
If Analytics auto-defined the entire file perfectly, and you want to update the generic field names
(“Field_1”, “Field_2”, and so on) – go to "Edit the auto-definition" on the facing page.
Tip
You can also update the generic field names in a subsequent page in the Data Definition Wizard,
which you may find more convenient.
If Analytics auto-defined the file and you want to make updates – for example, if the auto-definition
contains errors, excludes data that you need, or includes data that you do not need, or if you want to
add header or footer data to the detail data – go to "Manually define the PDF file" on page 294.
Tip
If the auto-definition contains significant errors, deleting the entire auto-definition and manually
defining the file can be easier.
Edit a field definition Right-click an aqua-blue field and select Edit Field, or double-click the field.
You can make a number of changes, including:
o updating the field name
o updating the data type
o under Advanced Options:
l changing the field length (Field Width)
l changing the starting position of the field
For detailed information, see "Working with field definitions" on page 299.
Edit a record definition Right-click a gray record and select Edit Record, or double-click the record.
You can make two main changes:
o update the categorization of the record – detail, header, and footer are the options
o modify the criteria that Analytics used to capture the set of records
For detailed information, see "Working with record definitions" on page 304.
Delete a field definition or a record definition
Note
You are deleting the field definition or the record definition only, not the actual data. If necessary,
you can redefine the same field or record data.
Tip
If you want to selectively delete records, select Edit Record and fine-tune the criteria that Analytics
used to capture the set of records.
For detailed information, see "Working with record definitions" on page 304.
Note
You can also define a PDF file using saved field and record definitions, if they exist.
For more information, see "Define the PDF file using a set of saved field and record
definitions" on page 296.
1. In the PDF File Definition page, select a data value to start defining one of the fields in the
table.
For example, you could select a social security number in an SSN field. When you select the
data value, the Field Definition dialog box opens.
Guidelines:
l You can select a value anywhere in the data. You do not have to use the first field in the
table, or select the first value in a field.
l The value you select can be detail data, header data, or footer data.
l Do not select field names. Leave all field names in the source file unselected. If you select
field names in the source file, Analytics treats them as data contained in fields.
l If field values vary in length, select the longest value, or select extra blank spaces to allow for
longer values that may be lower in the field and not currently displayed.
If you intend to use the initial data value you selected to uniquely identify a set of records, see
"Working with field definitions" on page 299.
2. Enter a name for the field, if necessary update the data type, and click OK.
3. In the data value you just selected, or in the same row in the file, select the character, or string
of characters, that uniquely identifies the set of records in the source file.
For example, select:
l a slash in a date value
l a decimal point in a numeric value
l a unique identifying value anywhere in the row containing the data value you selected
When you select the unique character or characters, the Record Definition dialog box opens,
and all records containing the character or characters are highlighted gray.
For detailed information, see "Defining and importing print image (report) files and PDF files"
on page 261.
If you need to define a record that extends beyond one row in the source file, see "Working with
multiline records and fields" on page 314.
4. If required, update the Record Type to match the type of data you are defining: detail, header,
or footer.
5. If required, modify the criteria used to capture the set of records.
For example, you could add additional criteria to omit some of the records that were initially
captured.
For detailed information, see "Working with record definitions" on page 304.
6. Click OK.
The field you defined is highlighted aqua-blue, and the associated set of captured records is
highlighted gray.
7. Scroll vertically to examine the defined field, and the associated set of captured records.
8. If the field is not defined correctly, or if the set of captured records needs adjustment, double-
click the field or the record, and make the necessary edits in the Field Definition dialog box, or
the Record Definition dialog box.
For more information, see "Working with field definitions" on page 299, or "Working with record
definitions" on page 304.
9. Define the remaining fields in the record by selecting a representative data value for each field.
Additional fields automatically conform to the set of records.
Guidelines:
l Define only those fields you want in the resulting Analytics table.
l With each field definition, scroll vertically to examine the defined field. Edit the definitions as
required.
For example, if data values are not fully contained by a field, you need to adjust the length or
starting position of the field, or both.
For more information, see "Edit the auto-definition" on page 293.
l If you need to define field values that extend beyond one row in the source file, see "Working
with multiline records and fields" on page 314.
Tip
The order in which you define detail fields is the order in which they appear in
the resulting Analytics table.
If you delete a detail field during the definition process, and then re-add it, it
loses its original position and is placed last among detail fields.
Note
Loading a print image query file deletes any current field and record definitions.
Note
Only load a file with definitions that you know match, or closely match, the
current data.
Note
Field and record definitions often represent a lot of work, and it is recommended
that you save them.
If you subsequently discover that the imported data needs an adjustment, and
must be redefined and reimported, saved definitions do not have to be
recreated from scratch.
2. When you are satisfied with all field and record definitions, click Next.
Note
If required, you can return to this point in the process and make updates to the
field and record definitions.
Note
Select a column heading in the preview table to see the properties associated with the
column.
Property Description
Ignore this field Excludes the field from the resulting table layout.
The data in the field is still imported, but it is undefined, and does not appear in the new
Analytics table. It can be defined later, if necessary, and added to the table.
Column Title The column title for the field in the default Analytics view.
If you do not specify a column title, the Name value is used.
Value A read-only property that displays the first value in the field.
The value dynamically updates based on any edits you make.
Decimal Numeric fields only. The number of decimal places in the source data.
Note
The Decimal text box appears automatically when you select a Numeric
data type.
Input Format Datetime fields only. The format of datetime values in the source data.
The format you specify must exactly match the format in the source data.
For more information about date and time formats, see "Formats of date and time source
data" on page 347.
l an SSN field
You will have greater success using a consistently structured field than you will using a field
with varying contents.
l One or more consistently positioned characters in the field must be unique, or uniquely
positioned, when compared to data above or below the field.
l Avoid a field with missing values. It is possible to use a field with missing values, but it
complicates the process of defining the file.
Note
The value you use to uniquely identify a set of records does not have to be contained
in the initial data value or the initial data field. It can occur anywhere in the row
containing the initial data value. For more information, see "Defining and importing
print image (report) files and PDF files" on page 261.
The figure below shows the Field Definition dialog box with the Advanced Options expanded.
The table below explains the purpose of each item in the Field Definition dialog box:
Name Specifies a field name other than the generic field name assigned by Analytics.
The name you specify becomes the physical field name in the resulting Analytics table –
that is, the field name in the table layout.
Starts on Line Specifies which line in a record contains the start of the field.
For example:
o If each record containing the field appears on a single line, then the value must be ‘1’
o If each record containing the field spans two lines, and the field starts on the second
line, then the value must be ‘2’
Note
The starting position of a field is critical to the success of the defining and
importing process. Once a field is defined, scroll through the source file to
ensure the starting position accommodates all values in the field. Adjust
the starting position if necessary.
For Unicode data, typically you should specify an odd-numbered starting
byte position. Specifying an even-numbered starting position can cause
characters to display incorrectly.
Note
Field length is critical to the success of the defining and importing process.
Once a field is defined, scroll through the source file to ensure that the
field is long enough to accommodate all values in the field. Adjust the
length if necessary.
For Unicode data, specify an even number of bytes only. Specifying an
odd number of bytes can cause characters to display incorrectly.
Field Height Specifies the number of lines that constitute a single value in the field.
For example:
o If each value appears on a single line, then the field height must be ‘1’
o If each value spans two lines, then the field height must be ‘2’
o If each value spans a varying number of lines, such as the content of a Note field, set
the field height to accommodate the value that spans the greatest number of lines (see
Ends on blank line below)
Tip
If numeric or datetime data in the source file is inconsistently formatted,
you can import it as character data and try to clean up the inconsistencies
using Analytics functions in the resulting Analytics table.
Convert to single field (character fields only, multiline fields only) Specifies that multiline fields
defined in the source file are imported to Analytics as a single field containing the data from all lines.
For example, if you define address data that spans several lines, selecting Convert to single field
creates a single field with all the address data on one line.
If you leave Convert to single field unselected (the default setting), multiline fields are imported to
Analytics as multiple fields each containing the data from a single line.
Fill if Blank Specifies that a field value is copied to subsequent blank values until a new field value
occurs.
For example, if the value “01” in the Product Class field appears in only the first record of
a block of Product Class 01 records, selecting Fill if Blank causes the value “01” to
appear with every record.
Ends on blank line (multiline fields only) Specifies that values in a multiline field terminate when
they encounter a blank line.
This option addresses a situation that occurs when values in a multiline field span a varying number
of lines. You must set the Field Height to accommodate the value that spans the greatest number of
lines. However, doing so can cause a mismatch between values with fewer lines and field or record
boundaries. Selecting Ends on blank line causes the field height, and the field or record boundaries,
to dynamically resize to fit the number of lines occupied by each value.
Note
This feature only works if one or more blank lines separate each value in the multiline field.
Note
The Control Total setting in the Field Definition dialog box does not
create control totals when you import a print image or PDF file to
Analytics. For information about creating control totals for this purpose,
see "Defining and importing print image (report) files and PDF files" on
page 261.
Record Type (Detail, Header, or Footer) Specifies the type of data represented by the records:
detail, header, or footer.
o Detail records – the main information in a file
For example, in a file listing overdue invoices, the invoice entries are the detail records. You can
define only one set of detail records in a file.
o Header records – the identifying information that appears above blocks or subsets of detail
records
For example, a file might list account information for each customer (header record), followed by a
list of each customer’s unpaid invoices (detail record). If necessary, you can define more than one
set of header records.
o Footer records – information that appears below blocks or subsets of detail records
For example, a file might list subtotals for each customer’s unpaid invoices (footer record). If
necessary, you can define more than one set of footer records.
Note
Although header and footer data is initially treated like a separate record
in the Data Definition Wizard, in the resulting Analytics table this data
becomes one or more additional fields, with repeated values, added to the
detail record.
Transparent (applies to header records only) Specifies that header records do not split multiline
detail records.
If a header record splits a multiline detail record in the source file, which can happen at a page
break, selecting Transparent unifies the detail record in the resulting Analytics table.
Record Name Allows you to customize the default record names that appear in the leftmost column in
the Data Definition Wizard.
You may find customizing a default name useful if you are creating multiple header or
footer records. The value appears in the Data Definition Wizard only and does not
appear in the resulting Analytics table.
Lines in Record Specifies the number of lines that constitute a single record in the source file.
For example, if each detail record in the source file appears on a single line, then the
value must be ‘1’. If each detail record spans three lines, then the value must be ‘3’.
Include or Exclude (part of the criteria builder) Specifies whether records that match the criteria
should be included in, or excluded from, the set of records.
This menu contains the following options:
o Include – include records that match the criteria
o Exclude – exclude records that match the criteria
Match On (part of the criteria builder) Specifies the method to use, or the type of characters to use,
to uniquely identify the set of records in the file.
This menu contains the following options:
o Exact Match – matching records must contain the character, or string of characters, in the Text
field, in the specified Line of the record, starting at the specified Start position
o Alpha – matching records must contain one or more alpha characters, in the specified Line of the
record, at the specified Start position, or in all positions of the specified Range
o Numeric – matching records must contain one or more numeric characters, in the specified Line of
the record, at the specified Start position, or in all positions of the specified Range
o Blank – matching records must contain one or more blank spaces, in the specified Line of the
record, at the specified Start position, or in all positions of the specified Range
o Non-Blank – matching records must contain one or more non-blank characters (includes special
characters), in the specified Line of the record, at the specified Start position, or in all positions of
the specified Range
o Find in Line – matching records must contain the character, or string of characters, in the Text
field anywhere in the specified Line of the record
o Find in Range – matching records must contain the character, or string of characters, in the Text
field, in the specified Line of the record, anywhere in the specified Range
o Custom Map – matching records must contain characters that match the character pattern in the
Text field, in the specified Line of the record, starting at the specified Start position
The Custom Map option uses the same syntax as the MAP( ) function (see the example following
this table).
Text (part of the criteria builder) For Exact Match, Find in Line, or Find in Range, specifies the
character, or string of characters, that uniquely identifies the set of records in the file.
For Custom Map, specifies the character pattern that uniquely identifies the set of records in the file.
The field is disabled for the other Match On options.
Line (part of the criteria builder) Specifies which line of the record the criteria applies to.
For example, if you create a custom map to match zip codes, and the zip codes appear on the third
line of a three-line record, you must specify ‘3’ in Line.
For single-line records, the value is always ‘1’.
Logic (part of the criteria builder) Allows you to add or delete criteria, and specify the logical
relations between criteria. You can add a maximum of 8 criteria.
This menu contains the following options:
o And – adds an additional criterion with a logical AND
o Or – adds an additional criterion with a logical OR
o Insert Criteria – inserts an empty criterion below the criterion to which it is applied
The criterion is initially inserted with a logical AND. You can change to a logical OR, but only after
you have specified values for the inserted criterion.
o Delete Criteria – deletes the criterion to which it is applied
o New Group – creates a separate criteria group
The New Group option allows you to build multiple criteria groups, which operate as separate blocks
of logic. The groups are related to one another with either a logical OR or a logical AND.
o End – designates a criterion as the final criterion
Selecting End for a criterion deletes any subsequent criteria, including criteria in other groups.
Tip
The Logic buttons may become unresponsive if you are missing values in
a criterion. Supply any missing values to reactivate the Logic buttons.
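For reference, the character patterns used by the Custom Map option follow MAP( ) function syntax,
in which, broadly, “9” matches a numeric character and “X” matches an alphabetic character, and
other characters are matched literally. As a rough sketch only, using a hypothetical field name
(zip_code) that is not part of the wizard, an equivalent test written as an Analytics expression would
be:
MAP(zip_code, "99999")
The expression returns T (true) for values that consist of five numeric characters, which is the same
kind of match a Custom Map criterion applies at the specified Start position in the specified Line of
a record.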
Note
If the values in a field vary in the number of words they contain, try to create a
separate field definition for these values, or ensure that these values
represent the last field at the end of a long field definition encompassing
multiple misaligned fields. The “Product Description” field in the sample
“Inventory.pdf” is an example of a field in which values vary in number of
words.
In Analytics, you will use the ALLTRIM( ), the REGEXREPLACE( ), and the SPLIT( )
functions to break up the single field into separate, aligned data elements.
3. Check the entire file to ensure that none of the values in the misaligned fields are outside the
aqua-blue highlighting of their field definition. Adjust the length of the field definition, if required.
4. Make sure that a data type of Character is specified for each field definition in the Field
Definition dialog box.
5. Complete the import process in the usual manner.
In the Edit Field Properties page, make sure that a data type of ASCII or UNICODE is
specified for every field.
For more information, see "Define and import a print image file" on page 283, or "Define and
import a PDF file" on page 291.
6. For a misaligned field with no data from an overlapping field, create a computed field in
Analytics that uses the following expression:
ALLTRIM(misaligned_field_name)
Leading and trailing spaces are removed from the field, which aligns all values in the field.
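If you are scripting the cleanup, a minimal ACLScript sketch of this step might look like the following.
The field names are placeholders rather than names taken from any particular file:
COMMENT Sketch only – define a computed field that trims the misaligned character field
DEFINE FIELD misaligned_field_clean COMPUTED ALLTRIM(misaligned_field_name)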
7. For a misaligned field that contains data from an overlapping field, do the following in Analytics:
a. Create an initial computed field that uses the following expression to replace one or more
spaces between the field value and the unwanted characters with a single space:
ALLTRIM(REGEXREPLACE(misaligned_field_name, "\s+", " "))
The expression also removes leading and trailing spaces from the misaligned field.
b. Create a second computed field that uses one of the following expressions to extract the field
value and discard the unwanted characters.
l If the unwanted characters are at the end of the field, use this expression:
SPLIT(initial_computed_field_name," ", 1)
l If the unwanted characters are at the beginning of the field, use this expression:
SPLIT(initial_computed_field_name," ", 2)
Tip
If unwanted characters are sometimes at the end of a field, and sometimes at
the beginning, or if they are present in only some of the records, you need to
create a conditional computed field that applies different versions of the
SPLIT( ) expression to different parts of the misaligned field. For example, the
condition RECNO( ) > 100 allows you to apply a version of the expression to
only those records beyond the first 100 records.
For more information, see "Define a conditional computed field" on page 735.
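As a sketch only, such a conditional computed field might be defined along the following lines in a
script. The field names are placeholders, the blank line after COMPUTED is required, and the last
line supplies the default value used when no condition is met:
COMMENT Sketch only – records beyond the first 100 have the unwanted characters at the beginning
DEFINE FIELD extracted_value COMPUTED

SPLIT(initial_computed_field_name, " ", 2) IF RECNO() > 100
SPLIT(initial_computed_field_name, " ", 1)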
8. For a long field definition that encompasses multiple misaligned fields, do the following in
Analytics:
a. Create an initial computed field that uses the following expression to replace one or more
spaces between data elements with a single space:
ALLTRIM(REGEXREPLACE(long_field_name, "\s+", " "))
The expression also removes leading and trailing spaces from the long field.
Tip
You may find including the OMIT( ) function in the expression is useful for
removing pieces of data that appear inconsistently and complicate
subsequent processing. For example, OMIT(ALLTRIM(REGEXREPLACE
(long_field_name, "\s+", " ")), "-") does the same as the expression above,
and also removes hyphens.
b. Create a second computed field that uses this expression to extract the first data element:
SPLIT(initial_computed_field_name," ", 1)
c. Create as many additional computed fields as required, using variations of the same
expression, to extract all the data elements.
For example:
SPLIT(initial_computed_field_name," ", 2)
SPLIT(initial_computed_field_name," ", 3)
To specify successive data elements, keep increasing the number in the segment parameter
of the SPLIT( ) function.
Note
For field values that contain more than one word, such as the values in the
“Product Description” field in the sample “Inventory.pdf”, this technique isolates
each word in a separate field. If required, you can reunite the values by
concatenating the separate fields. For more information, see "Concatenating
fields" on page 218.
9. Once you have finished extracting all the data elements to separate fields, do the following to
convert numeric and datetime data to the appropriate data type:
a. For numeric fields, create a computed field that uses this expression:
VALUE(field_name, number_of_decimal_places)
b. For datetime fields, create a computed field that uses this expression:
CTOD(field_name, "date_format")
Tip
You can save labor, and create fewer computed fields, by converting the data
type at the same time you apply functions to correct misaligned data. For
example:
VALUE(ALLTRIM(misaligned_field_name), 2)
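The same labor-saving approach can work for datetime fields. For example, assuming purely for
illustration that the cleaned-up values look like 12/31/2021, the trim and the conversion could be
combined in one expression:
CTOD(ALLTRIM(misaligned_field_name), "MM/DD/YYYY")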
10. Once you have created all the required computed fields, add them to the table view.
You do not need to add the initial computed field to the view, and you can remove any
misaligned fields, or long field or fields, from the view.
For more information, see "Add columns to a view" on page 777, or "Remove columns from a
view" on page 778.
Tip:
For PDF definition, you have the option of parsing the PDF file on a page-by-page
basis. In some cases, data misalignment occurs across page breaks. You may be
able to solve an alignment issue by using page-sized subsets of data.
Tip:
After importing the first subset, open the resulting table in Analytics, and
enter DISPLAY in the command line to display the data structure of the table
layout. Use the displayed table layout information as a guide for creating the
subsequent subsets of records and fields.
To save labor, use the generic Analytics field names (“Field_1”, “Field_2”,
and so on) when defining and importing subsets of records. Once you have
reassembled the data set in Analytics, you can rename all the fields in the
reassembled table.
2. When you save each Analytics data file, and each Analytics table layout, use an incrementing
numeric suffix to prevent overwriting tables you have already created. For example, “Table_
1.fil”, “Table_2.fil”, and so on.
3. Once you have defined and imported all the records in the source file, append the multiple
Analytics tables.
For more information, see "Extracting and appending data" on page 871.
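For reference, the append step can also be scripted. The sketch below assumes, for illustration only,
two subset tables named Table_1 and Table_2 with identical structures:
COMMENT Sketch only – append the records from Table_2 to the end of Table_1
OPEN Table_2
EXTRACT RECORD TO "Table_1" APPEND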
In "Single-line and multiline fields" on the facing page, the customer name and street
address are defined in this manner.
To define multiple, single-line fields, select a single instance of the multiline data. By default,
Analytics creates a separate field for each line. For each field, an incrementing numeric
suffix is added to the field name.
l As a single field containing the data from all lines
In "Single-line and multiline fields" on the facing page, the comment values are defined as a
single field.
To define a single field containing the data from all lines, select a single instance of the
multiline data. In the Field Definition dialog box, under Advanced Options , select Convert
to single field.
terminate when they encounter a blank line. This feature only works if one or more blank lines
separate each value in the multiline field.
o Column Title – Enter the column title to display in the default Analytics view. If a column title
is not specified the Name value is used.
o Type – Select the appropriate data type from the drop-down list. For information about the
supported data types in Analytics, see "Data types in Analytics" on page 740.
The Decimal and Input Format text boxes appear automatically when you select the
corresponding data type.
o Value – A read-only property that displays the first value in the field. The value is updated
based on any edits you make.
o Decimal (numeric fields only) – Specify the appropriate number of decimal places.
o Input Format (datetime fields only) – Specify the format that matches the data. For more
information about date and time formats, see "Formats of date and time source data" on
page 347.
9. Click Next after you have finished editing the field properties you want to change.
10. In the Final page, verify the settings for the new Analytics table and click Finish.
11. Enter a name for the Analytics table you are adding to the project, or keep the default name,
and click OK.
Column names Column names can have a maximum length of ten characters. The first ten
characters of each field must be unique or the duplicate fields cannot be exported.
Field names Field names must be specified in the first row, and data must start in the second row.
Data type Each column should contain values of only one data type. For example, if the first value in
a field contains character data, the field will be exported as character data.
Fields with numbers Fields that contain only numbers will be exported as numeric data. In some
cases, this will result in a field with the wrong data type in Analytics. For example, invoice numbers
are numeric values, but they are often stored in character fields. When this occurs, you need to
change the field data type in the Table Layout dialog box.
Steps
1. Select File > New > Table.
The first page displayed in the Data Definition Wizard depends on your configuration. If integration
with Analytics Server is enabled, the Select Platform for Data Source page is displayed; otherwise,
the Select Local Data Source page is displayed.
2. Complete one of the following steps to select the location of the file:
o If the Select Platform for Data Source page is displayed and you want to use Analytics to
define the file, select Local and click Next. In the Select Local Data Source page select
File and click Next.
o If the Select Platform for Data Source page is displayed and you want to use an Analytics
Server to define the file, select ACL Server and select the Windows server profile from the
drop-down list, and then click Next. In the Select ACL Server Data Source page select
Flat Files and click Next.
o If the Select Local Data Source page is displayed select File and click Next.
3. In the Select File to Define page, locate and select the file you want to create the Analytics
table from and click Open.
dBASE-compatible files have a .dbf file extension.
4. In the File Format page, verify that the dBASE compatible file option has been selected and
click Next.
5. In the Final page, verify the settings for the new Analytics table and click Finish.
6. Enter a name for the Analytics table you are adding to your project and click OK.
You can create an Analytics table by defining and importing an SAP Audit Format file (formerly
known as the SAP Private File Format). SAP Audit Format files have a .dat file extension.
You can define and import Audit Format files located on your local computer or on a network drive.
Note
Galvanize offers two utilities for directly accessing an SAP system and importing data
to Analytics:
l SAP connector – one of the Analytics connectors, available with an additional
subscription
l Direct Link – an optional add-on that can be purchased from Galvanize
Steps
1. Select File > New > Table.
The first page displayed in the Data Definition Wizard depends on your configuration. If integration
with Analytics Server is enabled, the Select Platform for Data Source page is displayed; otherwise,
the Select Local Data Source page is displayed.
2. Complete one of the following steps to select the location of the file:
o If the Select Platform for Data Source page is displayed and you want to use Analytics to
define the file, select Local and click Next. In the Select Local Data Source page select
File and click Next.
o If the Select Platform for Data Source page is displayed and you want to use an Analytics
Server to define the file, select ACL Server and select the server profile from the drop-down
list, and then click Next. In the Select ACL Server Data Source page select Flat Files and
click Next.
o If the Select Local Data Source page is displayed select File and click Next.
3. In Select File to Define, locate and select the file you want to create the Analytics table from
and click Open.
4. In the Character Set page, verify that the correct character set option has been selected and
click Next.
5. In the File Format page, verify that the SAP private file format / DART option has been
selected and click Next.
6. In the SAP Private File Format page, select the appropriate option for field naming:
l Use local language field descriptions as ACL field names – Select this option to use the
localized field descriptions configured for the SAP system, instead of the standard German
language field names. This option is recommended if the Analytics table will be used in only
one language.
l Use standard-delivered SAP German abbreviations as ACL field names – Select this
option if you prefer to use the German field names, or if the Analytics table will be used in
multiple languages.
7. Click Next.
8. In the Save Converted SAP File As dialog box, enter the file name and modify the folder
location for the Analytics data file, as necessary, and click Save.
9. In the Final page, verify the settings for the new Analytics table and click Finish.
10. Enter a name for the Analytics table you are adding to your project and click OK.
You can create an Analytics table by defining and importing an XML file. The Data Definition Wizard
allows you to select the XML elements to import, configure the structure of the resulting Analytics
table, and customize column headings and data types for the elements you are importing.
Note
In some cases, you may need to adjust one or more field definitions in the resulting
Analytics table so that the data displayed in the view accurately reflects the data in the
source XML file. You adjust field definitions in the Analytics table layout.
Analytics imports the exact raw data contained in an XML file, and you can see this
source data in the table layout. On occasion, a field definition created during the table
definition and import process misinterprets the source data, and the definition
requires subsequent adjustment. For example, a numeric field could be misinterpreted as a date
field, and dates rather than numbers may initially appear in the view.
Note
Generating a preview of the data in a large XML file can be slow, so the Auto
Preview option is automatically deselected for XML files larger than 2 GB.
c. Select and add all the data structures you want to include in the Analytics table.
d. If necessary, select a data structure in the Preview pane and click Remove to remove it.
e. Click Next.
An XML data structure is a collection of XML elements and attributes. For more information,
see "Selecting XML data structures" on page 328.
7. In the Select XML Elements page, fine-tune the selection of XML elements and attributes and
click Next.
For more information, see "Selecting and configuring XML elements" on page 329.
8. In the Preview Data page, modify the name or properties for any field, if necessary.
To modify field properties, select the appropriate column heading in the preview table, in the
bottom half of the page, and update any of the following properties:
o Name – Keep the name assigned by Analytics for the field in the table layout, or enter a
different name.
o Column Title – Enter the column title to display in the default Analytics view. If a column title
is not specified the Name value is used.
o Type – Select the appropriate data type from the drop-down list. For information about the
supported data types in Analytics, see "Data types in Analytics" on page 740.
The Decimal and Input text boxes appear automatically when you select the corresponding
data type.
o Value – A read-only property that displays the first value in the field. The value is updated
based on any edits you make.
o Decimal (numeric fields only) – Specify the appropriate number of decimal places.
o Input (datetime fields only) – Specify the format that matches the data. For more information
about date and time formats, see "Formats of date and time source data" on page 347.
9. Click Next.
10. In the Save Data File As dialog box, enter a name for the Analytics data file, and if necessary
modify the location where the file will be saved, and click Save.
11. In the Final page, verify the settings for the new Analytics table and click Finish.
12. Enter a name for the Analytics table you are adding to the project, or keep the default name,
and click OK.
13. Review the data in the new Analytics table and update any field definitions, if required.
If a field has the wrong data type specified, the data may not appear in the view, or the data
may be misinterpreted. For example, a numeric value may be interpreted as a date.
14. To update a field definition, do the following:
a. Select Edit > Table Layout.
b. In the Edit Fields/Expressions tab, double-click the field you want to modify.
c. Make the necessary changes to the field definition and click Accept Entry .
For example, you may need to change the Type from Datetime to Numeric .
Note
If more than one instance of a nested element exists within the data structure,
the repeated element may not be listed in the data structure in the treeview. You
can select the specific instances of the repeated element in a subsequent page
of the wizard.
Note
Generating a preview of the data in a large XML file can be slow, so the Auto
Preview option is automatically deselected for XML files larger than 2 GB.
3. Repeat steps 1 to 2 to add any additional data structures that you want to include in the
Analytics table.
4. Click Next.
Note
Generating a preview of the data in a large XML file can be slow, so the Auto Preview
option is automatically deselected for XML files larger than 2 GB.
Note
If adding an element violates your intended Analytics table structure and causes
gaps to appear in the table, data of the same kind probably needs to be merged
in a single column (step 4 below).
If adding an element creates duplicate or multiple identical records (identical
except for the added element), more than one instance of the element probably
exists within a parent element and the instances need to be split into separate
columns.
3. If you want to move a column, select the column and click the left or right arrow button, or drag
the column to a new position.
4. If you want to merge data of the same type in a column, select the column in the Preview table,
select the element or attribute to add in the XML Elements treeview, and click Add to Column.
5. If you want to modify column properties, select the column in the Preview table and click
Column Properties .
Make any of the following changes in the XML Column Properties dialog box and click OK:
l Change the name of the column.
l Change the data type of a column.
l If the column is assigned the Text data type, repeat the name of the column in each row of
the column.
l If an element is repeated in the data structure, assign specific instances of the repeated
element to the column. For example, if there are multiple <description> elements in a data
structure, you could assign only the first instance to a Description column.
l Remove a specific column from a merged column.
6. Click Next.
<catalog>
<cd>
<title></title>
</cd>
</catalog>
<organization>
<company/>
<description/>
<department/>
<description/>
</organization>
You can enter a single number, multiple numbers separated by commas, a numeric range, or a
combination of these. For example, if you want to include the first instance, and the fifth to tenth
instances, in a single column, enter the following: 1, 5-10. By default, all instances are initially
displayed in the column.
Tip:
If you want to add different instances of a repeated element to different columns
in an Analytics table, you need to create a column for each instance and set the
Instance field to the appropriate value.
6. Click OK.
File extensions
XML files typically use a standard file extension (.xml). In some cases, other file extensions are used,
and the first line of the document identifies it as an XML file. If a non-standard file extension is used,
you need to manually select the XML file format in the Data Definition Wizard.
<name>John Smith</name>
An attribute provides additional information about an element. In the following example, the type
attribute specifies that the account element represents a checking account:
<account type="checking">991110101</account>
In the Data Definition Wizard, attribute names are automatically preceded by the @ symbol to
distinguish them from element names. For example, an attribute named “type” is displayed as
“@type”.
XML sample
XML files usually include a mixture of elements and attributes, and at least one data structure. The
following example shows the contents of a simple XML file that contains two client records:
<?xml version="1.0"?>
<accounts>
<client>
<name>John Smith</name>
<ID>JS004</ID>
<account type="checking">991110101</account>
<account type="savings">991110808</account>
</client>
<client>
<name>Jane Smith</name>
<ID>JS005</ID>
<account type="checking">771110103</account>
<account type="savings">771110303</account>
</client>
</accounts>
You can create an Analytics table by defining and importing an XBRL file. The Data Definition
Wizard allows you to select the elements to import, and customize column headings and data types
for the elements you are importing.
1. Select File > New > Table.
2. If the Select Platform for Data Source page is displayed, select Local and click Next.
3. In the Select Local Data Source page, select File and click Next.
4. In the Select File to Define dialog box, locate and select the file you want to create the
Analytics table from and click Open.
XBRL 2.1 files have a .xbrl or .xml file extension. The difference between an XBRL file and
other XML files is that in the XBRL file the top-level, or root, element tag is <xbrl>.
5. In the File Format page, verify that the XBRL 2.1 file option has been selected and click Next.
6. In the Select XBRL Contexts to Import page, select the XBRL contexts to include in the
Analytics table and click Next. For details on this process, see "Selecting XBRL contexts" on
page 338.
7. In the Select Elements to Import page, select the elements to include in the Analytics table
and click Next. For details on this process, see "Selecting XBRL elements" on page 337.
8. In the Preview Data page, you can modify the name and properties for each field by selecting
the appropriate column heading in the preview table, in the bottom half of the page, and
updating any of the following properties:
o Ignore this field – If you do not want the field to be included in the Analytics table layout,
select this checkbox.
o Name – Keep the name assigned by Analytics for the field in the table layout, or enter a
different name.
o Column Title – Enter the column title to display in the default Analytics view. If a column title
is not specified the Name value is used.
o Type – Select the appropriate data type from the drop-down list. For information about the
supported data types in Analytics, see "Data types in Analytics" on page 740.
The Decimal and Input text boxes appear automatically when you select the corresponding
data type.
o Value – A read-only property that displays the first value in the field. The value is updated
based on any edits you make.
o Decimal (numeric fields only) – Specify the appropriate number of decimal places.
o Input (datetime fields only) – Specify the format that matches the data. For more information
about date and time formats, see "Formats of date and time source data" on page 347.
9. Click Next after you have finished editing the field properties you want to change.
10. In the Save Data File As dialog box, enter a name for the Analytics data file, and if necessary
modify the location where the file will be saved, and click Save.
11. In the Final page, verify the settings for the new Analytics table and click Finish.
12. Enter a name for the Analytics table you are adding to the project, or keep the default name,
and click OK.
Note
If you are working with a complex XBRL file you may need to, or may find it easier to,
define more than one Analytics table for the various contexts in the file, and then
define relations between the tables using the Relate Tables command.
Steps
1. Select File > New > Table.
2. In the Select Platform for Data Source page, select ACL Server and select the Windows
server profile to use from the drop-down list, and click Next.
3. In the Select ACL Server Data Source page, select Database Profile and select the
database profile to use from the drop-down list, and click Next.
4. In the Select Database/Schema page, select the schema (Oracle) or database (SQL Server
and IBM DB2) to access from the Schema drop-down list and click Next.
5. In the Select Tables page, select the database tables, views, and synonyms/aliases to add to
your query by selecting the item in the Available Tables list and clicking the right-arrow button.
You can select up to five tables, but if you select more than one table, each additional table you
select must be related to a previously selected table. Selecting multiple tables, particularly
tables with large numbers of records, will typically result in longer wait times before data is
displayed in Analytics.
When you select more than one table, Analytics displays the Identify Relationship dialog box
which you must use to identify the field in the table you are adding that relates to a table that