
Comparison of Pre-AI vs. Post-AI Data Processing

This document provides a comparative analysis of data processing methodologies before and after the integration of Artificial Intelligence (AI). It highlights the key components and steps involved in both approaches, illustrating how AI enhances data handling and analysis.
[Figure: AI Enhances Data Processing Efficiency and Accuracy. Pre-AI Data Processing: lower accuracy level, slower analysis speed, manual data handling. Post-AI Data Processing: higher accuracy level, faster analysis speed, automated data handling.]
Pre-AI Data Processing

  1. Profile Source: In the pre-AI stage, data profiling involves assessing the data sources to understand their structure, content, and quality. This step is crucial for identifying any inconsistencies or issues that may affect subsequent analysis.
  2. Standardize Data: Standardization is the process of ensuring that data is formatted consistently across different sources. This may involve converting data types, unifying naming conventions, and aligning measurement units (see the sketch after this list).
  3. Apply Reference Data: Reference data is applied to enrich the dataset, providing context and additional information that can enhance analysis. This step often involves mapping data to established standards or categories.
  4. Summarize: Summarization in the pre-AI context typically involves generating basic statistics or aggregating data to provide a high-level overview. This may include calculating averages, totals, or counts.
  5. Dimensional Analysis: Dimensional analysis refers to examining data across various dimensions, such as time, geography, or product categories, to uncover insights and trends.
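
As a rough illustration of steps 1 through 4, here is a minimal sketch in pandas; the column names, unit conversion table, and customer master list are all hypothetical, invented for the example:

```python
import pandas as pd

# Hypothetical source extract; column names are invented for illustration
df = pd.DataFrame({
    "cust_name": ["ACME Corp.", "acme corp", "Widgets, Inc."],
    "weight": [12.0, 0.012, 14.5],
    "weight_unit": ["kg", "t", "kg"],
})

# 1. Profile the source: structure, content, and quality
print(df.dtypes)        # structure
print(df.describe())    # basic content statistics
print(df.isna().sum())  # missing values per column

# 2. Standardize: unify naming conventions and align measurement units
df["cust_name"] = df["cust_name"].str.upper().str.replace(r"[.,]", "", regex=True)
to_kg = {"kg": 1.0, "t": 1000.0}  # assumed unit conversion table
df["weight_kg"] = df["weight"] * df["weight_unit"].map(to_kg)

# 3. Apply reference data: map names onto an established master list (assumed)
reference = {"ACME CORP": "C-001", "WIDGETS INC": "C-002"}
df["cust_id"] = df["cust_name"].map(reference)

# 4. Summarize: aggregate for a high-level overview
print(df.groupby("cust_id")["weight_kg"].agg(["count", "sum", "mean"]))
```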
Post-AI Data Processing
  1. Principal Component Analysis: In the post-AI framework, principal component analysis involves breaking down data into its constituent parts to identify patterns and relationships that may not be immediately apparent.
  2. Dimension Group: AI enables more sophisticated grouping of dimensions, allowing for complex analyses that can reveal deeper insights and correlations within the data.
  3. Data Preparation: Data preparation in the AI context is often automated and enhanced by machine learning algorithms, which can clean, transform, and enrich data more efficiently than traditional methods (see the sketch after this list).
  4. Summarize: The summarization process post-AI leverages advanced algorithms to generate insights that are more nuanced and actionable, often providing predictive analytics and recommendations based on the data.
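
A hedged sketch of how these steps might be chained with scikit-learn, where imputation and scaling stand in for automated data preparation and PCA for component analysis; the feature matrix is made up:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Made-up numeric feature matrix containing a missing value
X = np.array([[1.0, 200.0, 3.0],
              [2.0, np.nan, 1.0],
              [1.5, 210.0, 2.5],
              [3.0, 190.0, 4.0]])

# Automated preparation chained with component analysis:
# impute missing values -> scale features -> project onto principal components
pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="mean")),
    ("scale", StandardScaler()),
    ("pca", PCA(n_components=2)),
])

components = pipeline.fit_transform(X)
print(components)  # each row re-expressed in component space
print(pipeline.named_steps["pca"].explained_variance_ratio_)
```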
In conclusion, the integration of AI into data processing significantly transforms the methodologies used to handle and analyze data, improving accuracy, speed, and automation throughout.

Chocolate cake, MDM, data quality, machine learning, and creating the information value chain

The primary takeaway from this article is that you don’t start your machine learning, MDM, data quality, or analytical project with “data” analysis; you start with the end in mind, the business objective in mind. We don’t need to analyze data to know what it is; it’s like oil or water or sand or flour.

Unless we have a business purpose for using these things, we don’t need to analyze them to know what they are, because they are only ingredients in whatever we’re trying to make. What makes them important is the degree to which they are part of the recipe, and how they are associated.

Business Objective: Make Dessert

Business Questions: The consensus is Chocolate Cake; how do we make it?

Business Metrics: Baked Chocolate Cake

Metric Decomposition: What are the ingredients and portions?

2/3 cup butter, softened

1-2/3 cups sugar

3 large eggs

2 cups all-purpose flour

2/3 cup baking cocoa

1-1/4 teaspoons baking soda

1 teaspoon salt

1-1/3 cups milk

Confectioners’ sugar or favorite frosting

So here is the point: you don’t figure out what you’re going to have for dessert by analyzing the quality of the ingredients. Quality isn’t important until you put the ingredients in the context of what you’re making and how they relate, in essence, how the ingredients are linked or chained together.

In relation to my example of dessert and a chocolate cake: you might only have one cup of sugar, the eggs might have sat out on the counter all day, the flour could be coconut flour, etc. You make your judgment on whether or not to make the cake by analyzing all the ingredients in the context of what you want to make, which is a chocolate cake made with possibly warm eggs, coconut flour, and only one cup of sugar.

Again, belaboring the point: you don’t start your project by looking at a single entity, column, or piece of data until you know what you’re going to use it for in the context of meeting your business objectives.

Applying this to the area of machine learning, data quality, and/or MDM, let’s take an example as follows:

Business Objective: Determine Operating Income

Business Questions: How much do we make? What does it cost us?

Business Metrics: Operating income = gross income – operating expenses – depreciation – amortization.

Metric Decomposition: What do I need to determine operating income? (a worked sketch follows the decomposition below)

Gross Income = Sales Amount from the Sales table; dimensions: Product, Address

Operating Expense = Cost from the Expense table; dimensions: Department, Vendor

Etc…
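
To make the decomposition concrete, here is a worked sketch in pandas; the table layouts, column names, and figures are all assumptions invented for the example, not a real schema:

```python
import pandas as pd

# Hypothetical Sales and Expense tables with their dimensions
sales = pd.DataFrame({
    "product": ["P1", "P2"],
    "address": ["East", "West"],
    "sales_amount": [500_000.0, 300_000.0],
})
expenses = pd.DataFrame({
    "department": ["Ops", "Sales"],
    "vendor": ["V1", "V2"],
    "cost": [200_000.0, 150_000.0],
})

gross_income = sales["sales_amount"].sum()       # 800,000
operating_expenses = expenses["cost"].sum()      # 350,000
depreciation, amortization = 30_000.0, 10_000.0  # assumed figures

# Operating income = gross income - operating expenses - depreciation - amortization
operating_income = gross_income - operating_expenses - depreciation - amortization
print(operating_income)  # 410000.0
```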

Dimensions to analyze for quality:

Product

Address

Department

Vendor

You may think these are the ingredients for our chocolate cake in regard to business and operating income. However, we’re missing one key component: the portions, or relationships. In business, this means the association, hierarchy, or drill path that the business will follow when asking a question such as: why is our operating income low?

For instance, the CEO might first ask: in what area of the country are we making the least money?

After that, the CEO may ask: well, in that part of the country, what product is making the least money, who manages it, and what about the parts suppliers?

Product => Address => Department => Vendor

Product => Department => Vendor => Address
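
To picture how such a drill path answers the CEO’s questions, here is a toy sketch; the fact table and figures are invented, and the hierarchy is ordered to follow the geography-first questions above:

```python
import pandas as pd

# Invented fact table linking the metric to its dimensions
facts = pd.DataFrame({
    "address":    ["East", "East", "West", "West"],
    "product":    ["P1", "P2", "P1", "P2"],
    "department": ["Ops", "Ops", "Sales", "Sales"],
    "vendor":     ["V1", "V2", "V1", "V2"],
    "operating_income": [120.0, -40.0, 80.0, 60.0],
})

# Drill path: Address => Product => Department => Vendor
subset = facts
for level in ["address", "product", "department", "vendor"]:
    by_level = subset.groupby(level)["operating_income"].sum()
    worst = by_level.idxmin()                 # least profitable member
    print(f"{level}: {worst} ({by_level[worst]})")
    subset = subset[subset[level] == worst]   # drill into it
```

Each pass answers one of the CEO’s questions by narrowing the facts to the worst performer at that level before drilling down to the next.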

Many times these hierarchies, drill-downs, associations, or relationships are based on the various legal transactions of related data elements the company requires, either between their customers and/or vendors.

The point here is that we need to know the relationships, dependencies, and associations required for each business legal transaction we’re going to build, in order to link these elements directly to the metrics required for determining operating income, and subsequently answering questions about it.

No matter the project, whether we are developing a machine learning model, building an MDM application, or providing an analytical application, if we cannot provide these elements and their associations to a metric, we will not have answered the key business questions and will most likely fail.

The need to resolve these relationships is what drives the need for data quality, which is really a way of understanding what you need to do to standardize your data, because the only way to create the relationships is with standards and mappings between entities, as the sketch below illustrates.
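
A toy sketch of that idea: a relationship between facts and a master entity can only be built once both sides share a standardized key; the mapping table here is hypothetical:

```python
import pandas as pd

# Vendor names as they arrive from a source system
expenses = pd.DataFrame({
    "vendor_raw": ["Acme Corp.", "ACME CORPORATION"],
    "cost": [100.0, 250.0],
})

# Assumed standardization mapping agreed upon as the master standard
vendor_master = {"ACME CORP": "V-001", "ACME CORPORATION": "V-001"}

# Standardize the raw names, then map them to the master key
expenses["vendor_std"] = (expenses["vendor_raw"]
                          .str.upper()
                          .str.replace(".", "", regex=False)
                          .str.strip())
expenses["vendor_id"] = expenses["vendor_std"].map(vendor_master)

# Only now can costs be reliably related to the vendor dimension
print(expenses.groupby("vendor_id")["cost"].sum())  # V-001 -> 350.0
```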

The key is mastering and linking the relationships or associations required for answering business questions; it is certainly not just mastering “data” without context.

We need MASTER DATA RELATIONSHIP MANAGEMENT

not

MASTER DATA MANAGEMENT.

So, final thoughts: the key to making the chocolate cake is understanding the relationships and the relative importance of the data/ingredients to each other, not the individual quality of each ingredient.

This also affects the workflow. Many inexperienced MDM data architects do not realize that these associations form the basis for the fact tables in the analytical area. These associations will be the primary path (workflow) the data stewards follow in performing maintenance; the stewards will be guided by these associations to maintain the surrounding dimensions/master entities. Unfortunately, some architects instead focus on the technology and not the business. Virtually all MDM tools are model-driven APIs and rely on these relationships (hierarchies) to generate workflows and maintenance screens. Many inexperienced architects focus on the MVP (Minimum Viable Product), or a technical short-term deliverable, and are quickly called to task because the cost incurred by the business is not lowered, while the final product (the chocolate cake) is delayed and will now cost more.

Unless the specifics of questionable quality in a specific entity or table are understood in the context of the greater business question and its associations, that entity cannot be excluded or included.

An excellent resource for understanding this context can be found by following John Owens.

Final, final thoughts: there is an emphasis on creating the MVP (Minimum Viable Product) in projects today. My take is that in the real world you need to deliver the chocolate cake; simply delivering the cake with no frosting will not do. In reality, the client wants to “have their cake and eat it too”.

Note:

Operating income is a synonym for earnings before interest and taxes (EBIT) and is also referred to as “operating profit” or “recurring profit.” Operating income is calculated as: Operating income = gross income – operating expenses – depreciation – amortization.