
Capitalize Analytics Blog

Capitalize is BuyBoard Approved!

We are excited to announce the Local Government Purchasing Cooperative has awarded Capitalize Analytics a BuyBoard contract effective January 1, 2019!

As an approved BuyBoard vendor, Capitalize can now sell software and services to over 6,000 state agencies, local agencies, universities, and school districts across the US.

We are approved for the following software products & services:

  • IBM (Cognos, TM1/Planning Analytics, SPSS)
  • Alteryx
  • ThoughtSpot
  • Software Support Maintenance Agreements
  • Technology Staff Development and Training
  • Installation & Repair Service
  • All Types of IT Position(s)

For more information, please contact marketing@capitalizeconsulting.com.

We look forward to supporting you in 2019 & beyond!

Tools of the Trade: Which BI Tool Is Best?

As consultants, we are always being asked “Which tool should I use for BI/Analytics?” For better or worse, the answer isn’t simple. It’s a bit like saying “I have to cut some wood. What saw should I use?” To answer that question, we would need to know:

  • What kind of wood are you cutting?
  • What are the dimensions?
  • What are you doing with the wood that requires it to be cut?
  • Is the wood mobile, in a fixed place, hard to reach?
  • Are you just cutting it in half or are you doing something intricate?

Once we know the answers to those questions, a woodworking expert may say “Compound Miter Saw!”

Does that mean we should throw out all other saws and forever use a Compound Miter Saw? NOPE!

And so the story goes with BI. We need to know things like:

  • What type of reports do you need to build today?
  • Do you see the need for different functionality in the future?
  • Do you need to be able to build dashboards?
  • Who will be creating reports and what is their technical background?
  • Will you be doing anything that needs to be highly formatted in PDF?
  • How will people get to the reports?
  • What tools/infrastructure do you have today?
  • Is this for analysts, IT, end users?

There are dozens of questions we may ask.

What platforms and solutions do organizations need?

In the BI space we currently see a few “core” solutions: Enterprise Reporting, Dashboards/Visualizations, Predictive/Machine Learning/AI. Every organization is different. We need to understand the current state, technologies available, difficulties with the current environment, and goals of the organization before we start throwing technology at the problem.

Where do we start?

No company can run without enterprise reporting. You need financial statements, regulatory reports, invoices, etc. and those require distribution inside and outside the organization. Every ERP, CRM, and operational system has built-in reports, but having a single place to go for cross-system reporting, custom reports, etc. is essential. This is where tools like Cognos, Business Objects, SSRS, and others like them are incredibly important.

These tools can build dashboards, scorecards, and include web service APIs, but much of the time these tools are driven by IT and require a bit of technical knowledge to utilize fully and correctly. They may not be as “agile” as your departmental users and analysts would like!

Ideally, in your enterprise reporting solution, you’ll have something that has built-in enterprise security and a reusable data model. There will be change management ensuring published items can be trusted, governed, etc. Your CFO doesn’t want anyone changing up the P&L after he or she signed off on it!

Enterprise reporting is your Compound Miter Saw. It’s required.

When do we consider something else?

Sometimes we need something more agile, something that we can grab quickly, wherever we are, and do a quick job. Maybe you’re an analyst, you don’t have a “server,” and you probably don’t know much about metadata. You just want to install something on your computer and get stuff done! Hopefully you’ve realized that while Excel can do practically ANYTHING, it’s also a very manual and cumbersome process. For that, we may turn to our friends at Tableau, Power BI, or similar agile analytics tools.

These companies have made names for themselves by giving non-IT people tools that are more effective and efficient than Excel in many ways but don’t take a lot of effort or background knowledge to get started. You download a trial, grab some data, and you are slicing, dicing, and visualizing!

Your teams in marketing, accounting, engineering, etc. all have data and they STRUGGLE with it! Many of those organizations are dumping data from your enterprise reporting solution (that they hate) and then jamming it in Excel. They spend two days trying to cobble that data together, so they can get an answer for the sales team, controller, a client, etc. They live in Excel Hell.

When those analysts get a hold of a tool that cuts a two-day job down to a couple of hours, they rejoice! No longer do they need to download, copy-paste, VLOOKUP, SUMIF, pivot, and export to PowerPoint. They use purpose-built analyst tools to get the job done.

These analysts need something handheld that can get into tight places, cut curves, etc. They found a Jig Saw and they are thrilled with it.

Problem solved?

Everyone is happy right? Not exactly. The IT department is FREAKING OUT! “We have crazy people out in the line of business trying to chop down trees with a JIG SAW! It’s not SAFE! It’s not SECURE! STOP EVERYTHING! You are ROGUE IT and you’re putting the entire company AT RISK with your shenanigans!”

On the other hand, the new rogue IT team/analysts are having a party: they never need to speak to IT again because they can DO ANYTHING now!

This polarization is running rampant at most organizations right now. The battle between enterprise IT and rogue IT is strong and it’s unhealthy. We must stop working against each other and realize that there are times when both strategies are appropriate. I believe the olive branch needs to be extended from IT.

Why is IT the one in the driver’s seat?

Because the line of business knows IT as the team of “no.” “No, you can’t have access to the database. No, we can’t build what you need this month. No, you can’t store those files on our server. No, we can’t give you remote access. No, no, no.”

So, the line of business finds a way. Coming in and telling them that “No, you can’t use the solution you found since we didn’t help you” is not the answer.

The best IT organizations we work with right now are doing everything they can to enable their data-hungry departments and individuals by giving them tools everyone can collaborate on. That may mean bringing in something that isn’t on your “approved enterprise application list,” but it’s WAY better than the crazy things going on in Excel behind your back!

Forward-thinking IT departments are challenging their business unit counterparts to push the boundaries of automation, process improvement, and data analysis. They give them tools like Alteryx, Tableau, and Power BI that empower the departments and offer a way out of the Excel nightmare.

But what about security, enterprise standards, etc.?

We need those too. We need a framework that embraces agile tools (jig saws) when they are the best tools for the job, and also supports enterprise tools (compound miter saws) when the solution calls for it.

We have discovered another amazing thing that happens when the users use these agile tools in their departments to build solutions. They are creating REQUIREMENTS! Think about it; when have you ever been able to pin down a user to create real requirements documents with legitimate use cases and test cases? NEVER! Now they are out building a working solution that gets them the data and answers they need!

If IT looks at the solution the user created, and it is something that needs to be turned into an enterprise class solution, just port over the solution the user created. IT can then add security, auditing, change control, and any other company standards. If done well, the users should be happy to use something that is more automated, scheduled, and bullet-proof that they don’t have to worry about supporting!

There are many saws and many analytics tools out there. Rather than saying one is the best and the rest are unacceptable, let’s think a little more openly. The best tool out there is the one that actually gets used and gets the job done. A tool that sits on the shelf while people do things with hand saws/Excel is a waste of money!

Conclusion and Call to Action

Your organization is here to make money, teach students, cure cancer…not to fight over tools and internal fiefdoms. Let’s work together to advance the organization and let the tools enable that mission.

Capitalize Analytics is an organization passionate about helping people use data more effectively and efficiently. We search out and destroy manual processes, data challenges, integration problems, etc. We take the time to understand the entire problem and select tools purpose-built to resolve it.

Give us a call and let’s build a collaborative enterprise and agile data analytics community at your organization!

Upcoming Oil & Gas Webinar! DataHub: How to Trim Months Off Your System Implementation & Asset Migration

Monday, December 10, 2018 @ 12:00 PM CST

Asset migration and reporting is painful! DataHub can help!

If you’ve ever implemented a revenue accounting or land system, or converted from one system to another, you know the process is PAINFUL! It can take months to manually set up assets in a new system and it can be very error-prone.

Capitalize has implemented and converted hundreds of clients and assets over the past 13 years. We’ve improved each process every time, first with scripts to accelerate implementation, and now via a platform we have developed to trim months off your acquisition, divestiture, or system implementation.

That platform, called DataHub, is a “universal translator” that:

1) Automates the movement of master and transactional data from one system to another
2) Allows you to easily migrate single or multiple assets from your Test environment to Production
3) Creates a multi-data source “data warehouse” for cross-system reporting

DataHub allows us to move data from one accounting system to another (examples include Waterfield, Quorum, or even Excel spreadsheets) in a fraction of the time normally required. We can also combine your measurement, revenue, and financials into single reports and dashboards for a single view of your operation.

This webinar is ideal for oil & gas professionals working with revenue or land accounting software.

If you know the pain of implementation, migration, and reporting we can help! Join our webinar on December 10, 2018 and let us give you a deeper dive into what DataHub is and how it can save your organization both time and money!

Can’t make it? Still register and we’ll send you the recording!

Register Now

UPCOMING WEBINAR! Cognos 11.1 is a game changer! Let us show you why!

IBM recently released Cognos Analytics 11.1 and the buzz is in the air. Many in the industry feel this release is MAJOR in both added functionality and value. With over 20 years of experience in the analytics space, the Capitalize team agrees!

Please join us on Tuesday, November 6th at 1:00 pm EST where we give viewers a straightforward, in-depth look and demonstration of the new content and features including:

  • Data exploration guided by artificial intelligence (AI)
  • Increased geospatial capabilities
  • Automatic visualization recommendations
  • Ability to reuse content for quick, easy creation of new dashboards, reports, etc.
  • Stunning and fast visualizations
  • Simplified data preparation
  • Expanded machine learning processing

This webinar is intended for current Cognos users as well as anyone interested in learning more about IBM Cognos. We’ll even cover licensing and pricing!

Can’t make it? No worries! Go ahead and register and we’ll make sure to send you the recording.



Discount Codes for ThoughtSpot Beyond 2018 Conference

ThoughtSpot: Search & AI-Driven Analytics Platform for Humans

  • The world’s first relational search engine.
  • Get automated insights in a single click.
  • Data insights you can trust.
  • Maximum sources. Minimum modeling.
  • Get up and analyzing fast.
  • Scale enterprise-wide to everyone.

ThoughtSpot is holding their first conference, Beyond 2018, on November 13-15, 2018 in Washington DC. It promises to be The Enterprise Data & Analytics Event of the Year – you won’t want to miss it!

Hear from analytics thought leaders like Billy Beane, former General Manager of the Oakland Athletics and subject of the book “Moneyball”; Jaya Kolhatkar, Chief Data Officer of Hulu; and many more. Learn about cutting-edge trends in analytics and hear directly from enterprises actively transforming their businesses with advanced analytics.

ThoughtSpot has generously provided Capitalize Analytics with a discount code to share with those interested in attending. Codes are limited, so if you want to attend Beyond 2018, please contact us at marketing@capitalizeconsulting.com right away to take advantage!

For more information about ThoughtSpot:  https://www.thoughtspot.com/

For more information about Beyond 2018:  https://www.thoughtspot.com/beyond2018

Upcoming Webinar: Empower Your K12 Departments with Dashboarding and Analytics Using Tableau

Thursday, September 20, 2018 1:00 PM CDT

School districts have analysts in nearly every department. They analyze attendance, assessments, enrollment, performance, curriculum, financials – the list is endless. With Tableau, you no longer have to rely on IT to create every report or dashboard.

In this webinar, we’ll show you how districts are using data from Excel, the SIS, and ERP to quickly answer ad-hoc questions and how you can build dashboards to enable principals, superintendents, school board members, etc. to make decisions with data-driven information.

Join us on September 20th to learn more!

After registering, you will receive a confirmation email containing information about joining the webinar.

Can’t make it? Still register and we’ll send you the recording!


Upcoming Alteryx Higher Education Webinar

Alteryx: How Higher Ed Institutions Nationwide Are Simplifying Data Access, Automating Processes, and Forging Ahead with Predictive Analytics!

Higher education has so much data from the registrar’s office, admissions, financial aid, and institutional advancement, just to name a few. Alteryx helps institutions wrangle all this data by enabling higher ed departments and personnel to prep, blend, and analyze multiple data sources from across the university to help answer their most pressing questions and predict outcomes.

In this webinar, attendees will learn how Alteryx helps with:

  • Data integration to keep up with ever-changing student, staff, and overlapping data.
  • Process automation to identify at-risk students and notify advisors and other applicable departments.
  • Predictive analytics to forecast student enrollment after application & acceptance, with a highlight of Alteryx’s geospatial capabilities.

While these are just a few examples, Alteryx is relevant to everyone in higher education. If data integration takes up too much of your time, if you repeat the same steps in Excel each month, or if predictive analytics has moved to the top of your to-do list, then join us on Wednesday, September 26, 2018 at 12 PM CT to let us show you how Alteryx is the answer!

After registering, you will receive a confirmation email containing information about joining the webinar.

Can’t make it? Still register and we’ll send you the recording!


Invitation: North Texas IBM Cognos User Group Meeting

Capitalize Analytics is sponsoring the next North Texas IBM Cognos User Group Meeting on Thursday, August 16, 2018. We, along with IBM, would like to invite you to attend!

The meeting will run from 8:30 am to 1:00 pm with a continental breakfast and lunch provided.

Some of the topics covered will be:

  • Cognos Analytics Tips and Tricks: Use Cognos Analytics to Update Databases, Call Webservices, & Send Emails with Multiple Attached Documents/Reports by Capitalize Analytics
  • Cognos Analytics at Scale: How Cognos Analytics is Enabling Success at 100s of School Districts Nationwide
  • Cognos Analytics in a Diverse Technology World: How Other BI/ETL/Predictive Technologies Work with Cognos as Part of an Enterprise Data Analytics Strategy
  • IBM Watson Studio: Build/Train AI Model in One Integrated Environment

To register, please click here:

Registration Link

The meeting will take place at the IBM offices located at Beltline in Coppell.

1177 South Belt Line Road

Coppell TX 75019-4652

*Enter through the customer entrance – the front doors facing Beltline.

We look forward to seeing you there!

DataRobot: Capitalizing on the AI/Machine Learning Revolution

Here at Capitalize, we continually search out emerging technologies that are revolutionizing operations and insights for our customers. For the past several years, leaders in their respective industries have been focused on digitalization and improving infrastructures. These initiatives have been purpose-built to arm their data scientists with the information required for advanced analytics.

In fact, Forrester published an article in 2017 about artificial intelligence (AI) and how it will drive the insights revolution, predicting that “AI companies will take $1.2 trillion from competitors by 2020.” Companies that fail to embrace this revolution will face a steep challenge to maintain market share.

With so many different use cases and potential for AI and Automated Machine Learning across every industry, Capitalize Analytics wants to make sure we find and deliver technologies that position our customers for success. We search for easy-to-use, quick-to-implement, reliable products to recommend to our clients interested in placing themselves at the head of the artificial intelligence revolution.

Recently we reached out to DataRobot, an artificial intelligence and Automated Machine Learning platform with a vision for enabling data-driven organizations to embrace this innovation. Their goal is to make AI easy to use, easy to understand, and easy to apply, and to make it applicable to every business process to predict outcomes. This allows the data-driven enterprise to adapt to new conditions at incredible speeds and continually self-optimize based on predictions of their future state.

While AI is conceptually appealing, it does present a few challenges at the start, the most obvious of which is staffing. Organizations today are faced with a lack of experienced data scientists on staff, due to high demand and low supply in the job market. Today, predictive initiatives require individuals possessing a blend of:

  • Domain Expertise: knowledge of the business and its issues and, most importantly, of the available data.
  • Programming Skills: the ability to write the code necessary to gather, interrogate, and manipulate data, extract actionable insights, and build and implement the proper models.
  • Advanced Mathematics and Statistics Skills: knowledge of the applicable algorithms and the experience to interpret and explain their findings.


We were excited to learn that with their Automated ML platform, DataRobot reduces dependency on this difficult-to-find programming, mathematical, and statistical expertise. This reduced dependency on internal resources frees up the business to gain greater insight through:

  • Accessibility: little to no data science experience is required to get started.
  • Speed: training and testing hundreds of models in a fraction of the time it takes the average data scientist to create one.
  • Transparency: this is not a black-box environment; users can see everything that is happening under the hood.
  • Mass Model Production: the speed of model creation turns your users into a mass-producing model factory.
  • Deployment: no re-coding is required to deploy models, and deployment can be completed in a matter of minutes.


But what good does this predictive technology do without the ability to combine, visualize, distribute, and understand its findings? Integrating with today’s data technology leaders, DataRobot helps bridge several solutions together to develop an advanced analytics ecosystem. They do this by combining data from on-premises big data warehouses like Cloudera and Hortonworks; utilizing cloud analytics platforms like AWS, Azure, and Google Cloud; then leveraging data preparation tools such as Alteryx, Trifacta, and Paxata.

The final step in delivering these predictions to the organization is accomplished through integrations with data visualization tools such as Qlik and Tableau. In this way, DataRobot can truly make everyone in the organization a “Citizen Data Scientist.”


To read what Forbes is saying about DataRobot click here: DataRobot Puts The Power Of Machine Learning In The Hands Of Business Analysts – Forbes

To watch a recording of our webinar and see a brief demonstration of this powerful platform click here: Capitalize Vendor Spotlight: DataRobot


Alteryx Solution for Updating Workers’ Compensation Code on Employee Records

There are many times when we need to get large amounts of data into a system, but the front end of the application doesn’t provide a way to do it quickly. This creates a dilemma: how do we get hundreds or thousands of items loaded without doing it by hand, which could take days? Alteryx is a solution that makes data movement, blending, and loading quick and easy. It also requires ZERO SQL or other coding skills, so more users can be part of the process.

During a new implementation of PowerSchool’s eFinancePLUS, a district wanted to use the system’s ability to calculate the workers’ compensation premiums during the pay run processing. They created the appropriate codes and assigned them to the right job classes for the processing to occur. However, the codes were not able to be added en masse to individual employee records for later use in running informational reports.

Because the district had added the codes to employee records in a spreadsheet, an Alteryx workflow was created to update the appropriate table in the database with the assigned codes. An explanation of the workflow process follows.

The actual SQL server/database connection information is hard-coded in the workflow.

The demographic conversion spreadsheet with employee information has many columns, but it must have the [EMPLOYEE NUMBER] and [WORKCOMP] columns. It can only contain one sheet.

Since the directory path and name of the spreadsheet could vary, a ‘File Browse’ Interface Tool is used to ask the user to locate the desired file. Text is displayed that instructs the user on what to do.


An ‘Action’ Interface Tool is used to tell Alteryx what to do with the information. In this case, the action type is ‘Update Input Data Tool (Default)’ and it is required.


An ‘Input Data’ tool is used to define information about the spreadsheet data. The default name assigned in the configuration will be replaced with the file name selected by the user in the previous step.


A ‘Select’ tool is used to keep only the employee number and workers comp code from all the columns in the spreadsheet. The Size field of the workers comp code was changed from the default setting of 255 to 4, as that’s how it’s defined in the database table.


A ‘Filter’ tool is used with a Custom Filter to keep only the rows where both the employee number and the workers’ comp code are not Null.


This tool has two outputs – T(rue) and F(alse). Nothing is done with the records that don’t meet the criteria and that portion of the workflow ends. The rows that meet the condition are passed to the ‘Join’ Tool.
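A typical Custom Filter expression for this check would be !IsNull([EMPLOYEE NUMBER]) && !IsNull([WORKCOMP]) (our reconstruction; the original expression isn’t reproduced here). The equivalent True/False split can be sketched in Python on made-up rows:

```python
# Hypothetical spreadsheet rows after the Select tool.
rows = [
    {"EMPLOYEE NUMBER": "1001", "WORKCOMP": "8868"},
    {"EMPLOYEE NUMBER": "1002", "WORKCOMP": None},   # missing code -> False output
    {"EMPLOYEE NUMBER": None,   "WORKCOMP": "7382"}, # missing number -> False output
]

def not_null(row):
    # Mirrors the assumed filter: !IsNull([EMPLOYEE NUMBER]) && !IsNull([WORKCOMP])
    return row["EMPLOYEE NUMBER"] is not None and row["WORKCOMP"] is not None

true_out = [r for r in rows if not_null(r)]       # passed on to the Join tool
false_out = [r for r in rows if not not_null(r)]  # dropped; this branch of the workflow ends
```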

An ‘Input Data’ tool is used to connect to the server and access the reference table where the valid workers’ comp codes are assigned to job classes. A SQL statement is used to accomplish this and is entered in the ‘Table or Query’ configuration option:
SELECT DISTINCT wkr_comp FROM clstable

A list of distinct values is returned as the same code could be assigned to more than one job class. The resulting rows are passed to the ‘Join’ Tool.
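The DISTINCT matters because several job classes can share one workers’ comp code; a small Python sketch of the deduplication (the job class rows are made up):

```python
# Hypothetical job class rows: the same wkr_comp code repeats across classes.
clstable = [
    {"job_class": "TEACHER",   "wkr_comp": "8868"},
    {"job_class": "LIBRARIAN", "wkr_comp": "8868"},
    {"job_class": "DRIVER",    "wkr_comp": "7382"},
]

# Equivalent of: SELECT DISTINCT wkr_comp FROM clstable
valid_codes = sorted({row["wkr_comp"] for row in clstable})
print(valid_codes)  # ['7382', '8868']
```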

A ‘Join’ tool combines the output from the previous steps and has three possible outputs. ‘Join by Specific Fields’ is selected in the configuration, and the common field from the database table (the workers’ comp code) is associated with the common field from the employee information.


The ‘Left’ output contains the database reference table values that don’t match any of the employee records. Nothing is done with these records and this portion of the workflow ends.

The ‘Right’ output contains the employee records with a workers’ comp code that has not been assigned to any job class records. Since pay run processing is based on values in the job class table, if codes assigned to employees are NOT assigned to any job class records, no processing occurs and having employees assigned to these codes is irrelevant. These records will be displayed for the user to review.

The ‘Join’ output contains the employee records with a workers’ comp code that has been assigned to at least one job class. These records can be used to update the database table.
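The three-way split described above behaves like this in plain Python (employee numbers and codes are hypothetical):

```python
# Employee rows that survived the not-null Filter (hypothetical data).
employees = [
    {"EMPLOYEE NUMBER": "1001", "WORKCOMP": "8868"},
    {"EMPLOYEE NUMBER": "1002", "WORKCOMP": "9999"},  # code not assigned to any job class
]
# Distinct wkr_comp values pulled from the job class table.
valid_codes = {"8868", "7382"}

# 'Join' output: employees whose code exists in the job class table -> update the database.
join_out = [e for e in employees if e["WORKCOMP"] in valid_codes]
# 'Right' output: employees with an unassigned code -> goes to the review report.
right_out = [e for e in employees if e["WORKCOMP"] not in valid_codes]
# 'Left' output: reference codes with no matching employee -> nothing is done with these.
left_out = valid_codes - {e["WORKCOMP"] for e in employees}
```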

An ‘Output Data’ tool is used to update the database table. The server information and destination table are hard-coded in the configuration. The ‘Output Options’ value is set to ‘Update: Warn on Update Failure’ and the ‘Append Field Map’ uses a ‘Custom Mapping’ to assign the workflow fields to the corresponding table fields.
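Under the hood, that ‘Update’ output option behaves like a keyed SQL UPDATE per matched row. A minimal sqlite3 sketch (table and column names here are illustrative; the real eFinancePLUS schema will differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (empl_no TEXT PRIMARY KEY, wkr_comp TEXT)")
conn.execute("INSERT INTO employee VALUES ('1001', NULL), ('1003', NULL)")

# Rows from the 'Join' output, mapped as (workers' comp code, employee number).
matched = [("8868", "1001")]

# One keyed UPDATE per matched row; unmatched employees are left untouched.
conn.executemany("UPDATE employee SET wkr_comp = ? WHERE empl_no = ?", matched)

print(conn.execute("SELECT empl_no, wkr_comp FROM employee ORDER BY empl_no").fetchall())
# [('1001', '8868'), ('1003', None)]
```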

A ‘Sort’ tool is used to sort the records needing review by the user since their workers’ comp codes didn’t exist in the job class table. They are sorted by workers’ comp code then employee number.


A ‘Table’ tool is used to set up the layout of the records for use in a report. ‘Table Mode’ is set to Basic in the configuration and ‘Show Column Headings’ is selected. The workers’ comp code is displayed first, then the employee number.


A ‘Report Header’ tool is used to create a title for the report. The date and time are included in the header for this report, but no logo is added.


A ‘Layout’ tool is used to order the output on the report. The ‘Layout Mode’ is set to ‘Each Individual Record’ in the configuration and the ‘Per Row Configuration’ specifies the ‘Header’ tool is displayed before the ‘Table’ tool.


A ‘Folder Browse’ tool asks the user where the report should be saved. Text is displayed that instructs the user on what to do.

An ‘Action’ Interface Tool is used to tell Alteryx what to do with the information. In this case, the action type is ‘Update Value (Default)’. The ‘Replace a specific string’ option on the configuration is checked and the file path of the default file name for the report is entered in the text box. This information will be replaced by what the user has selected.



A ‘Render’ tool is used to create the PDF report of the records needing review by the user. The ‘Output Mode’ in the configuration is set to ‘Choose a Specific Output File’ and a default file name including a directory path is entered. The directory path will be replaced by what the user selected in the previous step, but the file name entered will be used as-is.


Because Interface Tools are used, Alteryx automatically sets the workflow configuration to ‘Analytic App’. (Clicking anywhere on the canvas – not a tool – displays the workflow configuration on the left.)

When the user runs the app, a dialog box is displayed.

The user selects both the spreadsheet of employee information to import as well as the directory to save the PDF report in.

Clicking the ‘Finish’ button runs the workflow. If there are records needing review, a message is displayed when the workflow ends.

Clicking ‘OK’ opens the report. The user can also click the ‘Show Output Log’ link to see the number of records processed at each step.

If there are no records needing review, a different message is displayed when the workflow ends.

The user is returned to the main dialog screen where they can run the workflow again with a different input file if desired, or click the ‘Exit’ button to close the app.

Using Alteryx to create a workflow allows the user to update data in a table without knowing any SQL. They can run the process multiple times if they decide to change or add codes after the initial run. Built-in data checking can also prevent errors in data later.

For additional information, please contact us at marketing@capitalizeconsulting.com!