
Capitalize Analytics Blog

Invitation: North Texas IBM Cognos User Group Meeting

Capitalize Analytics is sponsoring the next North Texas IBM Cognos User Group Meeting on Thursday, August 16, 2018. We, along with IBM, would like to invite you to attend!

The meeting will run from 8:30 am to 1:00 pm with a continental breakfast and lunch provided.

Some of the topics covered will be:

  • Cognos Analytics Tips and Tricks: Use Cognos Analytics to Update Databases, Call Webservices, & Send Emails with Multiple Attached Documents/Reports by Capitalize Analytics
  • Cognos Analytics at Scale: How Cognos Analytics is Enabling Success at 100s of School Districts Nationwide
  • Cognos Analytics in a Diverse Technology World: How Other BI/ETL/Predictive Technologies Work with Cognos as Part of an Enterprise Data Analytics Strategy
  • IBM Watson Studio: Build/Train AI Model in One Integrated Environment

To register, please click here:

Registration Link

The meeting will take place at the IBM offices on Belt Line Road in Coppell.

1177 South Belt Line Road

Coppell TX 75019-4652

*Enter through the customer entrance – the front doors facing Belt Line Road.

We look forward to seeing you there!

DataRobot: Capitalizing on the AI/Machine Learning Revolution

Here at Capitalize, we continually search out emerging technologies that are revolutionizing operations and insights for our customers. For the past several years, leaders in their respective industries have been focused on digitalization and improving infrastructures. These initiatives have been purpose-built to arm their data scientists with the information required for advanced analytics.

In fact, Forrester published an article in 2017 about artificial intelligence (AI) and how it will drive the insights revolution, predicting that “AI companies will take $1.2 trillion from competitors by 2020.” Companies that fail to embrace this revolution will face a steep challenge in maintaining market share.

With so many potential use cases for AI and automated machine learning across every industry, Capitalize Analytics wants to make sure we find and deliver technologies that position our customers for success. We search for easy-to-use, quick-to-implement, reliable products to recommend to clients interested in placing themselves at the head of the artificial intelligence revolution.

Recently we reached out to DataRobot, an artificial intelligence and Automated Machine Learning platform with a vision for enabling data-driven organizations to embrace this innovation. Their goal is to make AI easy to use, easy to understand, and easy to apply, and to make it applicable to every business process to predict outcomes. This allows the data-driven enterprise to adapt to new conditions at incredible speeds and continually self-optimize based on predictions of their future state.

While AI is conceptually appealing, it does present a few challenges at the start, the most obvious of which is staffing. Organizations today are faced with a lack of experienced data scientists on staff, due to high demand and low supply in the job market. Today, predictive initiatives require individuals possessing a blend of:

  • Domain Expertise: knowledge of the business, its issues, and, most importantly, the available data.
  • Programming Skills: the ability to write the code needed to gather, interrogate, and manipulate data, extract actionable insights, and build and implement the proper models.
  • Advanced Mathematics and Statistics Skills: knowledge of the applicable algorithms and the experience to interpret and explain their findings.


We were excited to learn that with their Automated ML platform, DataRobot reduces dependency on this difficult-to-find programming, mathematical, and statistical expertise. This reduced dependency on internal resources frees up the business to gain greater insight through:

  • Accessibility: little to no data science experience is required to get started.
  • Speed: train and test hundreds of models in a fraction of the time it takes the average data scientist to create one model.
  • Transparency: this is not a black-box environment; users can see everything happening under the hood.
  • Mass Model Production: the speed of model creation turns your users into a model factory.
  • Deployment: no re-coding is required to deploy models, and deployment can be completed in a matter of minutes.


But what good does this predictive technology do without the ability to combine, visualize, distribute, and understand its findings? Integrating with today’s data technology leaders, DataRobot helps bridge several solutions together into an advanced analytics ecosystem. They do this by combining data from on-premises big data platforms like Cloudera and Hortonworks; utilizing cloud analytics platforms like AWS, Azure, and Google Cloud; then leveraging data preparation tools such as Alteryx, Trifacta, and Paxata.

The final step in delivering these predictions to the organization is accomplished through integrations with data visualization tools such as Qlik and Tableau. In this way, DataRobot can truly make everyone in the organization a “Citizen Data Scientist.”


To read what Forbes is saying about DataRobot click here: DataRobot Puts The Power Of Machine Learning In The Hands Of Business Analysts – Forbes

To watch a recording of our webinar and see a brief demonstration of this powerful platform click here: Capitalize Vendor Spotlight: DataRobot


Alteryx Solution for Updating Workers’ Compensation Code on Employee Records

There are many times when we need to get large amounts of data into a system, but the front end of the application doesn’t provide a way to do it quickly. This creates a dilemma: how do you load hundreds or thousands of items without doing it by hand, which could take days? Alteryx is a solution that makes data movement, blending, and loading quick and easy. It also requires ZERO SQL or other coding skills, so more users can be part of the process.

During a new implementation of PowerSchool’s eFinancePLUS, a district wanted to use the system’s ability to calculate workers’ compensation premiums during pay run processing. They created the appropriate codes and assigned them to the right job classes for the processing to occur. However, the codes could not be added en masse to individual employee records for later use in running informational reports.

Because the district had added the codes to employee records in a spreadsheet, an Alteryx workflow was created to update the appropriate table in the database with the assigned codes. An explanation of the workflow process follows.

The actual SQL server/database connection information is hard-coded in the workflow.

The demographic conversion spreadsheet with employee information has many columns, but it must have the [EMPLOYEE NUMBER] and [WORKCOMP] columns. It can only contain one sheet.
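As a minimal sketch of the pre-check the workflow depends on, the column requirement above could be expressed in Python (the `missing_columns` helper and its sample headers are hypothetical, not part of the actual Alteryx app):

```python
# Validate that an imported sheet's header row contains the two columns
# the workflow requires; any extra demographic columns are ignored.

REQUIRED = {"EMPLOYEE NUMBER", "WORKCOMP"}

def missing_columns(header_row):
    """Return the required columns absent from the spreadsheet's header row."""
    return sorted(REQUIRED - {h.strip().upper() for h in header_row})

print(missing_columns(["Employee Number", "Name", "WorkComp"]))  # []
print(missing_columns(["Employee Number", "Name"]))              # ['WORKCOMP']
```

A check like this mirrors what Alteryx enforces implicitly: if either column is missing, the downstream Select and Join steps have nothing to work with.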

Since the directory path and name of the spreadsheet could vary, a ‘File Browse’ Interface Tool is used to ask the user to locate the desired file. Text is displayed that instructs the user on what to do.


An ‘Action’ Interface Tool is used to tell Alteryx what to do with the information. In this case, the action type is ‘Update Input Data Tool (Default)’ and it is required.


An ‘Input Data’ tool is used to define information about the spreadsheet data. The default name assigned in the configuration will be replaced with the file name selected by the user in the previous step.


A ‘Select’ tool is used to keep only the employee number and workers’ comp code from all the columns in the spreadsheet. The Size field of the workers’ comp code was changed from the default of 255 to 4, as that’s how it’s defined in the database table.


A ‘Filter’ tool is used with a Custom Filter to keep only the rows where both the employee number and the workers’ comp code are not Null. The expression used is:


This tool has two outputs – T(rue) and F(alse). Nothing is done with the records that don’t meet the criteria and that portion of the workflow ends. The rows that meet the condition are passed to the ‘Join’ Tool.
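The original screenshot of the filter expression is not reproduced here, but based on the description it would be something along the lines of `!IsNull([EMPLOYEE NUMBER]) AND !IsNull([WORKCOMP])`. A Python sketch of the same True/False split (sample rows are hypothetical):

```python
# Split rows the way the Filter tool does: the True stream keeps rows where
# both fields are present; the False stream collects everything else.
rows = [
    {"EMPLOYEE NUMBER": 1001, "WORKCOMP": "8868"},  # kept
    {"EMPLOYEE NUMBER": 1002, "WORKCOMP": None},    # dropped: missing code
    {"EMPLOYEE NUMBER": None, "WORKCOMP": "9101"},  # dropped: missing number
]

true_stream = [r for r in rows
               if r["EMPLOYEE NUMBER"] is not None and r["WORKCOMP"] is not None]
false_stream = [r for r in rows if r not in true_stream]

print(len(true_stream), len(false_stream))  # 1 2
```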

An ‘Input Data’ tool is used to connect to the server and access the reference table where the valid workers’ comp codes are assigned to job classes. A SQL statement is used to accomplish this and is entered in the ‘Table or Query’ configuration option:
SELECT DISTINCT wkr_comp FROM clstable

A list of distinct values is returned as the same code could be assigned to more than one job class. The resulting rows are passed to the ‘Join’ Tool.
A ‘Join’ tool combines the output from the previous steps and has three possible outputs. ‘Join by Specific Fields’ is selected in the configuration and the common field from the database table is associated to the common field from the employee information (workers’ comp code).


The ‘Left’ output contains the database reference table values that don’t match any of the employee records. Nothing is done with these records and this portion of the workflow ends.

The ‘Right’ output contains the employee records with a workers’ comp code that has not been assigned to any job class records. Since pay run processing is based on values in the job class table, if codes assigned to employees are NOT assigned to any job class records, no processing occurs and having employees assigned to these codes is irrelevant. These records will be displayed for the user to review.

The ‘Join’ output contains the employee records with a workers’ comp code that has been assigned to at least one job class. These records can be used to update the database table.
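The three Join outputs described above can be sketched in Python as set operations on the workers’ comp code (the reference codes and employee rows here are made-up examples, not the district’s data):

```python
# Left  = reference codes with no matching employee rows (ignored),
# Right = employee rows whose code is absent from the job class table (reported),
# Join  = employee rows whose code is valid (used to update the database).

ref_codes = {"8868", "9101"}  # distinct wkr_comp values from the job class table
employees = [
    {"EMPLOYEE NUMBER": 1001, "WORKCOMP": "8868"},
    {"EMPLOYEE NUMBER": 1003, "WORKCOMP": "7777"},  # not in any job class record
]

join_out = [e for e in employees if e["WORKCOMP"] in ref_codes]
right_out = [e for e in employees if e["WORKCOMP"] not in ref_codes]
left_out = ref_codes - {e["WORKCOMP"] for e in employees}

print([e["EMPLOYEE NUMBER"] for e in join_out])   # [1001]
print([e["EMPLOYEE NUMBER"] for e in right_out])  # [1003]
print(sorted(left_out))                           # ['9101']
```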

An ‘Output Data’ tool is used to update the database table. The server information and destination table are hard-coded in the configuration. The ‘Output Options’ value is set to ‘Update: Warn on Update Failure’ and the ‘Append Field Map’ uses a ‘Custom Mapping’ to assign the workflow fields to the corresponding table fields, as shown:

A ‘Sort’ tool is used to sort the records needing review by the user since their workers’ comp codes didn’t exist in the job class table. They are sorted by workers’ comp code then employee number.


A ‘Table’ tool is used to set up the layout of the records for use in a report. ‘Table Mode’ is set to Basic in the configuration and ‘Show Column Headings’ is selected. The workers’ comp code is displayed first, then the employee number.


A ‘Report Header’ tool is used to create a title for the report. The date and time are included in the header for this report, but no logo is added.


A ‘Layout’ tool is used to order the output on the report. The ‘Layout Mode’ is set to ‘Each Individual Record’ in the configuration and the ‘Per Row Configuration’ specifies the ‘Header’ tool is displayed before the ‘Table’ tool.


A ‘Folder Browse’ tool asks the user where the report should be saved. Text is displayed that instructs the user on what to do.

An ‘Action’ Interface Tool is used to tell Alteryx what to do with the information. In this case, the action type is ‘Update Value (Default)’. The ‘Replace a specific string’ option on the configuration is checked and the file path of the default file name for the report is entered in the text box. This information will be replaced by what the user has selected.



A ‘Render’ tool is used to create the PDF report of the records needing review by the user. The ‘Output Mode’ in the configuration is set to ‘Choose a Specific Output File’ and a default file name including a directory path is entered. The directory path will be replaced by what the user selected in the previous step, but the file name entered will be used as-is.


Because Interface Tools are used, Alteryx automatically sets the workflow configuration to ‘Analytic App’. (Clicking anywhere on the canvas – not a tool – displays the workflow configuration on the left.)

When the app is run by the user, a dialog box displays:

The user selects both the spreadsheet of employee information to import as well as the directory to save the PDF report in.

Clicking the ‘Finish’ button runs the workflow. If there are records needing review by the user, this displays when the workflow ends:

Clicking ‘OK’ opens the report. The user can also click the ‘Show Output Log’ link to see the number of records processed at each step.

If there are no records needing review by the user, this displays when the workflow ends:

The user is returned to the main dialog screen where they can run the workflow again with a different input file if desired, or click the ‘Exit’ button to close the app.

Using Alteryx to create a workflow allows the user to update data in a table without knowing any SQL. They can run the process multiple times if they decide to change or add codes after the initial run. Built-in data checks can also prevent data errors later.

For additional information, please contact us at marketing@capitalizeconsulting.com!

Vendor Spotlight Webinar: DataRobot



Automated Machine Learning for Predictive Modeling

Originally Recorded on Wednesday, July 11, 2018 @ 12:00 pm CDT

What should you be doing with AI and Machine Learning?

How are your competitors leveraging predictive models?

According to Forrester Research, AI-driven companies will take $1.2 trillion from competitors by 2020. In the age of artificial intelligence (AI) and big data, organizations must embrace and leverage new automated machine learning technologies to build a competitive advantage and succeed.

In this vendor spotlight webinar, attendees will learn:

  • How machine learning and AI are transforming the way business is done
  • How to implement machine learning initiatives without hiring a large team of difficult-to-find data scientists
  • The basics of automated machine learning and how it enables organizations to make better, faster decisions that result in tangible business value
  • A demo of the DataRobot automated machine learning platform

To keep from falling behind, watch our webinar Automated Machine Learning for Predictive Modeling.

Watch Now

ThoughtSpot 4.5 Now Available!

This release lets you view trending and popular content on a new Smart Homepage, compare measures with a single keyword, perform advanced time-series analysis, and a whole lot more.

Check out some of the highlights of this release:

Smart Homepage
Easily access trending, popular content

Comparison Analysis
Compare measures with a single keyword

Enhanced Time Series Analysis

Pivot Tables & Localization
Country Maps, Pivot Table Formatted Reporting, Localization


There’s a whole lot more included in ThoughtSpot 4.5. See a demo today and experience the next generation analytics platform that lets you use search to analyze your data and get automated insights with a single click.

For more information, please contact marketing@capitalizeconsulting.com

Capitalize Oil & Energy Lunch n’ Learn WebEx 3-Week Series

Enjoy our 3-week Oil & Energy Series which shows how to launch your journey to self-service analytics. In this series, you’ll learn how to work in Alteryx Designer, easily transition from Excel columns and rows to repeatable workflows, best practices in prepping, parsing, and blending your data with Alteryx, followed by a dose of next-level analytics—a dive into statistical, predictive, prescriptive, and spatial models.

Week 1 of 3 – Applying Alteryx to Energy Specific Data Challenges

Originally presented on June 14, 2018

This session shows how Alteryx takes daily data wrangling challenges and automates them to allow for more scalable initiatives.

  • Discuss joining multiple data files into a more complete data set for deeper analytics
  • Discuss operational workflows and use cases from a user perspective
  • Discover how Alteryx can help take your analytics to the next step
Watch Now

Week 2 of 3 – How Alteryx Enables Predictive Analytics with SCADA/Time Series Data

Originally presented on June 21, 2018

  • Demonstrate the ability to ingest, clean, and wrangle sensor data and build toward trend analysis and predictive modeling
  • Discuss Time Series and SCADA operational workflows and use cases
  • Tackle the sensor data challenge
  • Perform trend analysis and predictive analytics within the Alteryx platform
Watch Now

Week 3 of 3 – Adding Geospatial Components to your Data with Alteryx

Originally presented on June 28, 2018

  • Discuss an operational methane gas leak detection and analysis use case
  • Explain why geospatial analysis matters
  • Demonstrate geospatial capabilities within Alteryx
Watch Now

Upcoming Webinar! Vendor Spotlight: Agile Upstream

Know More with AI: Let the Software Read Your Leases and Agreements

Wednesday, June 06, 2018, 12:00 pm – 1:00 pm CDT

Finding critical information in lease contracts is a tedious process, and missing even a single provision can be costly.

Agile Upstream has developed an Artificial Intelligence platform providing E&P land departments the ability to quickly upload lease documents, extract critical data, and review and validate vital information. A process that typically takes months can be completed in days, allowing for decreased financial risk and liability, and improved operational efficiency.

Join Capitalize Analytics and Agile Upstream to hear how Agile is transforming the way information is managed and leveraged in upstream and midstream.

Ideal for: Land professionals, Technology and Strategy Experts at E&P and midstream companies

Key takeaways: Hear how cutting-edge technology is quickly surfacing the critical data in contracts and lease documents needed to make accurate strategic decisions.

After registering, you will receive a confirmation email containing information about joining the webinar.

Can’t make it? Still register and we’ll send you the recording!


NAPAC 2018

This week, Capitalize will attend and exhibit at NAPAC, the North American Petroleum Accounting Conference, Thursday, May 17 & Friday, May 18 in Dallas, Texas at the Westin Galleria.

We look forward to seeing old friends and introducing ourselves to new ones!

Stop by and see us!


O&G Tax Calculation: A Cognos Success Story

When it comes to taxes, government agencies can be particular about how they want taxes prepared and calculated. In the state of Louisiana, taxes are applied to the physical well’s volume and allocated sales, not to the tract. A tract is an ownership percentage of one-to-many physical wells.

The problem: Our client has an allocation system that applied taxes to the tracts rather than to the physical wells. The client attempted to map those tract results back to the physical wells, which proved difficult and time-consuming. Complicating matters further, two sets of physical wells were added together and then split 25/75 based on contract terms, and some were Take-in-Kind contracts. This led to too many errors in taxes paid.

The client also had to pay six different types of taxes. Three of them were volumetric-based, while the remaining three were based on sales value. Depending on the physical well’s exemption certificate, one of three tiers of rates also had to be applied. This led to 12 different tax entries for each owner tract settled.

The solution: The allocation system that the client used already contained all the information we needed to produce the required result: physical wells’ volume, the rates, liquid allocation, and meter type to determine which rate to use.

We created a report that would calculate the information we needed and could easily be loaded via template into the client’s database. This report calculated the taxable volume for taxes paid on volume per physical well and on calculated sales condensate values. In the report to be uploaded, each meter was broken out by the six taxes charged. Unlike tracts, physical wells could only have six different tax charges on gas and liquids, while tracts could have up to 18.

We created another report to help the tax department determine which physical meters needed to apply for tax exemption or reduction based on flow hours. Lower flow hours allowed them to apply for a reduction in tax rates in a timelier manner.

The conclusion: These new reports helped the tax department process tax filings faster and created a more efficient channel of communication between the tax department and accounting regarding the application of exemptions to physical wells. The solution also prompted the client to invest in accounting software that mimics the report function within the application itself, calculating the taxes before dividing them into tracts.

For additional information, please contact us at marketing@capitalizeconsulting.com!

Upcoming Higher Education Webinar: Five Cognos Tips Every Banner Client Needs to Know!

If your institution has Banner and you want to take full advantage of Cognos, then this is the webinar for you!

Capitalize Analytics is an IBM premier business partner focused on helping higher education get real meaning from their data. We understand today’s institutions are pulled in multiple directions and challenged to meet higher expectations with fewer resources.

In this webinar, we will demonstrate:

1. Dynamic Dashboarding: Explore where your departments are spending their budgets.
2. Geographic Dashboarding: Use maps to compare the high school performance of admitted students to the campus average.
3. Drill Through Dashboard to Reports: Discover student retention demographics by course and department with drill-through reports.
4. Communicating Information through Bursting: Burst your budget reports so college deans know where they stand financially.
5. Getting Notified about Changes in Status from Event Studio: Notify your financial aid students when they fail multiple classes.

Attend our webinar and let us show how you can get the most out of Cognos for Banner!


After registering, you will receive a confirmation email containing information about joining the webinar.