
How Geographic Data is Used to Make the Best Business Decisions

When making important business decisions, companies, governments, and organizations increasingly rely on geographic data to tailor their choices to circumstances on the ground. From deciding where to open new stores to determining whether groups in particular regions will respond favorably to new products, big data features heavily in the process. It’s important, however, to understand what geographic information is, why it is used, how it is utilized, and what its common applications are.

What is Geographic Data?

Thinking about data as it exists spread out in space often helps people – especially those who aren’t highly numerate – take something abstract and place it in a specific context. From a marketing perspective, geographic data allows you to think about the “where” factor when it comes to customers. Where are your established customers? Where is the audience you still need to reach?

Bundle together enough where-factors and you’ll soon get to the “what” questions. A company can use data analytics to ask questions like, “What is the business case for expansion of marketing in this region?”

With advances in artificial intelligence, it’s even possible to move on to questions of “who”. By developing a profile of the customers who respond well to your products, services, and messages, you can begin to ponder, “Who else matches the profile of our current customers?” With the right marketing approach, this can help a business open up new opportunities, increase revenue, and improve customer retention.

Why Use Geographic Data?

Companies increasingly use geographic information to gain an advantage over competitors. Firms that value data science have the potential to significantly outpace their competition. They can collect data in real time, process it through artificial intelligence applications, and present insights to decision-makers sooner.

Companies that fail to recognize the value behind the geographic details in their customer data will soon find themselves falling behind their competitors. Even for small and medium-sized businesses, this data is crucial for success. True success comes from growth, and healthy growth depends on safe, smart business decisions informed by the geographic details in customer data.

Geographic Data is Big Data

A commitment to data analytics begins with making a choice about organizational culture. When a company wants to become a data-centric operation, it needs to onboard stakeholders and, in some cases, offload those who resist digital transformation. Bringing in people who have data science backgrounds is an investment in new hiring, and it also entails purchasing the computing power they require to do their jobs well.

In terms of geographic marketing data, this approach calls for seeing almost everything as a data point to be gathered. When customers check out at stores, it’s common for companies to gather information in a variety of ways. In the simplest form, this includes asking direct questions, such as requesting each customer’s ZIP code. More advanced approaches include developing rewards programs that create opt-ins for tracking data about purchases and habits based on customer demographics.

Marketing Applications of Data Analytics

One of the increasingly common use cases for geographic data in the marketing world is targeted advertising. By combining opt-in data from your own sources, such as email marketing, web pages, and apps, with data from vendors like Google and Facebook, you can micro-target advertisements to almost anyone who profiles as the type of customer you want to attract. If a business is trying to reach young couples in a region where it has stores, it can use GPS data from an app to micro-target so precisely that offers are only sent when those customers are within close driving range of a physical store.
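To make the mechanics concrete, here is a minimal Python sketch of the distance check this kind of geotargeting relies on. The store coordinates and the 10-mile “close driving range” are illustrative assumptions, not values from any real campaign.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    r = 3958.8  # Earth's mean radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative values: a store in Tampa, FL and a 10-mile radius.
STORE_LAT, STORE_LON = 27.95, -82.46
RADIUS_MILES = 10.0

def should_send_offer(user_lat, user_lon, matches_profile):
    """Send the offer only to profiled users who are currently near the store."""
    near_store = haversine_miles(user_lat, user_lon, STORE_LAT, STORE_LON) <= RADIUS_MILES
    return matches_profile and near_store

print(should_send_offer(27.99, -82.50, matches_profile=True))  # True: roughly 4 miles away
```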

The logic of this can be taken in interesting directions. For example, a company that sells anti-frizz shampoo could target ads based on the current weather in an area. When the humidity goes up, advertising appeals can be deployed to let people know about the product at the precise moment they’re realizing they need a solution to a problem.
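A sketch of that trigger logic might look like the following. The 70% threshold and the fetch_humidity() helper are hypothetical stand-ins; a real system would call an actual weather data provider.

```python
HUMIDITY_THRESHOLD = 70.0  # percent relative humidity; illustrative trigger point

def fetch_humidity(region: str) -> float:
    """Hypothetical stand-in for a call to a real weather API."""
    return {"tampa": 84.0, "phoenix": 22.0}.get(region, 50.0)

def choose_creative(region: str) -> str:
    """Serve the anti-frizz creative the moment local humidity spikes."""
    if fetch_humidity(region) >= HUMIDITY_THRESHOLD:
        return "anti_frizz_shampoo_ad"
    return "default_brand_ad"

print(choose_creative("tampa"))    # anti_frizz_shampoo_ad
print(choose_creative("phoenix"))  # default_brand_ad
```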

By developing a culture that values data science, a firm can open up new marketing opportunities. Information can be gathered rapidly, ideas can be tested swiftly, and marketing content can be updated on a minute-to-minute basis. This ultimately can make your firm more flexible in speaking to customers at times when and in places where they’ll be most receptive.

External Resources:

20 Ways GIS Data is Used in Business and Everyday Life


Why Data Security is Crucial to Business Prosperity

In the age of big data, the amount of information that companies collect is unprecedented. All that data, however, presents a tempting target for hackers. Even intruders who aren’t interested in grabbing customer or corporate data find value in getting past data security systems in order to take control of resources on networks. Data analysis is increasingly critical to financial success in a wide range of industries, and that means companies need to be invested in protecting their customers and themselves.

Data Breaches

We tend to think of certain organizations, such as big credit card companies, as the main targets of hacking, but the bad guys aren’t disinclined to hit a smaller operation, especially if they feel they can get in and out quickly. Worse, the good guys are sometimes the ones who cause a breach. Boston Medical Center discovered in March 2014 that 15,000 patient records had been exposed due to a failure by a third-party services provider. Records were accessible without a password, and the exposed data included names, addresses, and medical information. More astonishing, the incident was the product of poor practices, not an overt hacking attempt.

A 2016 report found that only about a quarter of businesses seem to have a full understanding of the challenges they face. Even the ones that consider themselves to be knowledgeable about and tough on security anticipate continued attempts at breaching their systems, regardless of their efforts. Recovery times from breaches are expected to be at least 8 weeks per incident.

Unusual Hacking Goals

Data breaches themselves aren’t the only thing hackers are interested in these days. In January 2018, an attack was discovered on Kaseya, a company that works with managed services providers. No information was lost because the intruders weren’t even looking for it, though they could have taken whatever they wanted. The hackers were instead taking over servers to use their processing power to make money in the booming cryptocurrency market. Only third-party data analysis of activity on the systems exposed what was happening.

Attack Vectors

Hackers are increasingly hitting targets from all directions, from installing malware to sending phishing emails in hopes of tricking someone into handing over a password. Others use automated tools to brute-force their way into protected systems.

One advantage of big data systems is that they can actually be turned into tools for data security defense. Many organizations utilize artificial intelligence to study information from incoming connections to computers, allowing them to identify patterns that human security specialists might miss. These big data-driven sentries are always on guard, and they have the capacity to learn and adjust as attacks evolve.
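As a small illustration of the idea, the sketch below trains scikit-learn’s IsolationForest on simulated connection logs and flags the outlier. The feature set (bytes sent, request rate, failed logins) and the synthetic data are assumptions for demonstration, not a production design.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Simulated connection features: [bytes_sent, requests_per_min, failed_logins]
normal = rng.normal(loc=[500.0, 20.0, 0.1], scale=[100.0, 5.0, 0.3], size=(1000, 3))
attack = np.array([[50_000.0, 600.0, 12.0]])  # a brute-force-like outlier
X = np.vstack([normal, attack])

model = IsolationForest(contamination=0.001, random_state=0).fit(X)
flags = model.predict(X)  # -1 marks connections the model considers anomalous
print("flagged rows:", np.where(flags == -1)[0])
```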

Compliance Requirements

As more incidents have occurred, governments have begun to catch up. Extremely strict rules went into effect in the European Union in 2018 with the advent of the General Data Protection Regulation. Fines for failures of data security compliance can reach €20 million or 4% of global annual turnover, whichever is greater, which for the largest firms could run into the billions of euros, and any company that interacts with even one citizen of an EU member state may be exposed to liability. That’s a huge risk for a shop somewhere in America to take just to make a couple more sales overseas. For those assembling true big data resources, it poses an existential risk.

Solutions

Companies need to stop seeing IT security as the domain of a specific department and start treating it as a part of an organization-wide cultural shift. All the way up to the C-suite level, training needs to be provided to ensure that decision-makers fully appreciate the challenges before their companies. Encryption of data, especially with the arrival of the GDPR and its accompanying regime of fines, should be seen as non-negotiable.

A threat response team should also be in place at every organization that collects big data. Being able to identify and respond to attacks sooner is estimated to reduce the per-customer cost of a breach by $14, according to one report. By taking a proactive approach, an enterprise can reduce its exposure and shorten its recovery time.

Are You Protected?

Many big data tools, even in 2018, are not as up-to-date on security as you would expect, lacking features such as multi-factor authentication and breach alerts. Your company’s data is irreplaceable and unlike any other; using software without strong security is like throwing out thousands, even millions, of dollars. Invest in software that puts your data’s security before anything else.


Data Science 101: Who, What, and Why

What is Data Science, and What is its Purpose?

Big data revolves around the idea that companies can acquire and process large quantities of information in a manner that allows them to make predictions with a high probability of being accurate (e.g., a fashion buyer might use data analytics gleaned from social media trends to identify what will be popular in the coming season). This can allow businesses to get out in front of competitors as soon as possible.


The field is grounded in a set of technical and mathematical skills collectively known as data science. Programming skills, in particular, rate highly in the industry. The three most commonly used programming languages in the business are Python, R, and Java. These are used to handle both the acquisition and processing of data.

Machine learning and artificial intelligence are also beginning to play bigger roles in the industry. Having gathered massive amounts of data, a company may lean on an artificial intelligence application to drill down through more information than any human could reasonably read in a lifetime and generate insights. Python is by far the dominant programming language for AI applications, thanks to its seamless integration with frameworks such as TensorFlow.
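As a taste of that workflow, here is a minimal, illustrative TensorFlow sketch: a tiny model learns to score customers from two synthetic features. Real systems are vastly larger, and the feature names and labels here are invented.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-ins for customer features, e.g. recency and spend.
X = np.random.rand(256, 2).astype("float32")
y = (X[:, 0] + X[:, 1] > 1.0).astype("float32")  # toy "will respond" label

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, verbose=0)
print(model.predict(X[:3], verbose=0))  # response scores for the first three customers
```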

Storage is also a big deal when it comes to the business of data. The simple act of collecting information calls for massive databases, and processed information demands another layer of storage. NoSQL databases are popular among data analytics firms.
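For a sense of what those layers can look like in practice, here is a minimal sketch using MongoDB through pymongo. The connection string, database, and collection names are assumptions, and it presumes a MongoDB instance is running locally.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["analytics"]["checkout_events"]

# Raw layer: store each checkout event as a flexible document.
events.insert_one({"customer_id": 42, "zip": "33602", "total": 57.10})

# Processed layer: aggregate spend by ZIP code for downstream analysis.
for row in events.aggregate([{"$group": {"_id": "$zip", "total_spend": {"$sum": "$total"}}}]):
    print(row)
```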

Who are Data Scientists?

Data scientists tend to be individuals with strong math and statistics backgrounds who also have some degree of programming ability. They can readily propose potential explanations for or solutions to problems and then devise mathematically sound tests to verify or rule out their ideas. 
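That propose-then-test loop can be as simple as a two-sample significance test. The sketch below asks, with purely synthetic numbers, whether a promotion actually lifted order values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(50, 10, 500)  # synthetic order values without the promotion
treated = rng.normal(52, 10, 500)  # synthetic order values with the promotion

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The lift is statistically significant; the idea survives.")
else:
    print("No significant lift; rule the idea out.")
```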

Many multinational companies employ data scientists, and smaller businesses are also starting to pick up on the trend. Netflix, for example, processes user data to such an extent that it even utilizes customers’ preferences to decide who to cast in shows and movies. Google employs artificial intelligence to examine emails that have been marked spam in order to do a better job of identifying future spamming efforts. Even the self-driving car revolution is being propelled by machine learning technologies designed to recognize traffic, people, animals and obstacles.

Churning through a very large amount of information is critical to the process, but good data scientists also know how to present insights to decision-makers. This includes using business intelligence platforms to show trends and predictions. Condensing all of that data into graphs and charts that can be quickly scanned and understood is what separates good data scientists from great ones.
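As a trivial example of that last step, the sketch below condenses a forecast into a bar chart with matplotlib; the regions and revenue figures are invented for illustration.

```python
import matplotlib.pyplot as plt

regions = ["Northeast", "Southeast", "Midwest", "West"]
predicted_revenue = [1.2, 1.8, 0.9, 1.5]  # $M, from some upstream model

fig, ax = plt.subplots()
ax.bar(regions, predicted_revenue)
ax.set_ylabel("Predicted revenue ($M)")
ax.set_title("Next-quarter revenue forecast by region")
fig.tight_layout()
fig.savefig("forecast_by_region.png")  # ready to drop into a deck
```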

Why is Data Science Important and Useful?

Data-driven decision-making has found a home in a wide range of industries. For organizations that don’t have the resources to go toe-to-toe with larger competitors, a dedicated, data-centric approach is a secret weapon. For a bigger enterprise, the goal is to stay in front of scrappy upstarts by building its own highly competent analytics department while also investing in the smartest data analytics platforms.

It isn’t a secret that companies in every industry are beginning to lean on data science and analytics as a source of power for future success. Over 90% of the world’s data has been created in the past two years, and the volume is still growing. Any business that ignores this fact will fall behind its competitors very quickly.

Data Science & Inzata

Great data scientists agree that data analysis applications without artificial intelligence and machine learning will soon be outdated.

The first and only of its kind, Inzata is an AI-powered data analytics platform hosted in the cloud, allowing for optimal processing and speed. Our full-service platform covers every inch of a data scientist’s daily tasks, from data ingestion to enrichment to modeling, and even curating insightful, readable visualizations for the rest of the company.

Inzata takes care of the tedious, pain-in-the-butt tasks of data science on a local machine, such as restructuring your data, without requiring any extra coding or data architects, all at impressive speed thanks to our patented aggregation engine. By seamlessly fitting into existing workflows, Inzata is the only tool needed to work with your past, present, and future data scripts.

For example, if you have three tables that you want to combine and analyze in R, you have three options (a Python sketch of the manual merge follows the list):

  1. Manually merge the tables in R and then do the analysis there. (Good luck with that: you’ll run out of RAM very quickly, and if you map to disk, it will take long enough for you to watch an episode of your favorite show before it’s done.)
  2. Manually merge the tables using SQL, then export the data into R. (SQL joins are also slow, the tables need a ton of prep work to avoid technical issues, and now you need to know an additional language.)
  3. Use Inzata AI to automatically merge your tables, then export the data to R in the exact structure you want in just a few minutes.
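For readers working in Python rather than R, the manual merge in options 1 and 2 looks roughly like the pandas sketch below; the file, table, and key names are illustrative assumptions.

```python
import pandas as pd

customers = pd.read_csv("customers.csv")        # one row per customer
orders = pd.read_csv("orders.csv")              # one row per order
demographics = pd.read_csv("demographics.csv")  # one row per ZIP code

merged = (
    orders
    .merge(customers, on="customer_id", how="left")
    .merge(demographics, on="zip", how="left")
)
merged.to_csv("analysis_ready.csv", index=False)  # hand off to R or elsewhere
```

Even this tidy version hides the real pain: each merge materializes a larger frame in memory, which is exactly where the RAM problems described above begin.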

Avoiding the hours typically wasted waiting on other programs to process data is as easy as upgrading to the most intelligent data analytics platform on the market: Inzata.

Written by: Nicole Horn, Alex Durante

September 2018


The Immense Value Behind Data Enrichment with Secondary Data

Techopedia defines data enrichment as “processes used to enhance, refine or otherwise improve raw data.” Raw data is just the seed, and data enrichment is the light needed to grow it into a strong, useful, and valuable mechanism for your business.

Ultimately, the goal of data enrichment is to boost the data you are currently storing with secondary data. Whether at the point of capture or after the data is accumulated, adding insights from reliable information sources is where the real value is gained. In other words, data enrichment is the journey of transforming your raw, commodity data into a true asset for your organization, project, or research.

Refining raw data should include the following steps (a minimal code sketch follows the list):

  • Removing errors such as null or duplicate values
  • Using data profiling to clarify the content, relationships, and structure of the data
  • Improving the data quality overall to increase its reliability and analytical value
  • Strategically adding attributes, relationships, and details from secondary data that uncover new insights about your customers, operations, and competition
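Here is the minimal pandas sketch promised above, covering the first two steps; the file and column names are illustrative.

```python
import pandas as pd

df = pd.read_csv("raw_customers.csv")

# Remove errors: drop rows missing key fields, then drop exact duplicates.
df = df.dropna(subset=["customer_id", "zip"]).drop_duplicates()

# Profile the content: column types, non-null counts, basic distributions.
df.info()
print(df.describe(include="all").transpose().head())
```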

Data refinement avoids the negative outcomes of working with bad data. Low-quality data can needlessly increase costs, waste precious time, cripple important decision-making, and even anger clients or customers.

During or after the refinement of your data, enriching it with advanced data dimensions, such as detailed time frames, geographic detail, weather history, and a wide variety of customer demographics from multiple secondary data libraries, is key to unleashing its true value to your company, customers, and shareholders.

  • What if you could predict which clients are most likely to buy, and exactly how much they will spend, just from their initial lead profile?
  • What if you could identify the key success characteristics of a new market or store location, just from viewing the demographics of the area?
  • How much easier would day-to-day decisions become if you could consider all of the factors involved, instead of just a few?

You will acquire a better and more complete understanding of your prospects and target market. You will learn more about your market by appending business information to the records that you capture and store, pinpointing key sociodemographic groups of business prospects, or improving efficiencies across your business units.

Most would agree that data enrichment with secondary data is valuable, so why do fewer than 10% of companies do it? The simplest answer is that it’s hard. It’s time-consuming and labor-intensive to gather and maintain all of these various enrichments, and hard to thread and blend data together while keeping it all accurate and organized. Let’s face it: most business professionals barely have time to analyze the data in front of them, much less go out and find other sources.

Let’s Talk About Inzata

Inzata is a data analytics platform designed to change all of that. Inzata offers a growing list of more than 25 separate enrichments, ranging from geospatial and location enrichments to weather data and advanced customer demographics with street-level accuracy.

Data enrichment is a core function of Inzata, designed as an integral part of our Agile Analytics™ workflow, which uses technology to turn raw data into digital gold.

Secondary data is the heart of data enrichment, and advanced customer demographics are arguably the strongest enrichment a company can use to add value to its customer data. Unlike any other data analytics platform, Inzata has over 150 customer demographics covering the entire nation built right into the platform for one-click access at all times. Some of these enrichments include:

  • Income brackets
  • Employment
  • Occupation
  • Housing occupancy and valuation
  • Marital Status
  • Education level
  • Industry facts

Enriching your customer data in this way greatly increases the value and precision of your analysis and allows you to answer much more complex questions about your business. Inzata makes enriching your data as simple as selecting which attributes you want to add and instantly adding them to your data.
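The underlying operation is essentially a join between your records and a demographic library. Here is a minimal pandas sketch of the idea; the demographics table is a stand-in for the kind of library a platform like Inzata exposes, and all names and values are invented.

```python
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "zip": ["33602", "10001", "33602"],
})
demographics = pd.DataFrame({
    "zip": ["33602", "10001"],
    "median_income": [68_000, 91_000],
    "college_grad_pct": [41.0, 58.5],
})

enriched = customers.merge(demographics, on="zip", how="left")
print(enriched)  # each customer now carries income and education context
```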

These enrichments are priceless for companies with big data on their hands. Being able to slice and dice large datasets by detailed demographics and behavioral characteristics makes them more precise, more manageable, and better able to tell you what’s actually going on inside your business. Think of enrichment as a force multiplier for your big data initiative: it means knowing more about your customers and your transactions. Failing to enrich a mass of simple customer data for your own benefit is like choosing a 2005 flip phone over a 2018 smartphone.

A Harvard Business Review article¹ mentions two very important statistics that show why data enrichment is crucial:

  • On average, 47% of newly created data records have at least one critical, work-impacting error.
  • Only 3% of the data quality scores in their study could be rated “acceptable” using the loosest possible standard.

Any business can avoid becoming one of these statistics by investing in a data analytics platform that provides powerful enrichments for top-notch data refinement and enhancement through a variety of secondary data sources.

Inzata’s platform is the first and only of its kind to include one-click enrichments for any data, from any source, for any business. Stay ahead of the curve in data analytics and invest in the best: invest in Inzata.

Sources

¹ “Only 3% of Companies’ Data Meets Basic Quality Standards,” Harvard Business Review, https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards


Top 3 Frustrations in Preparing Data for Analysis

The era of Big Data is upon us, and with it, business leaders are finding new insights in their data analytics to drive tactical and strategic decisions. Data visualization tools are widely available from many vendors, including Tableau, Qlik (QlikView), and Microsoft (Power BI).

The question is no longer ‘are you using Big Data?’ but rather, ‘why not?’

 
Visualization vendors make data analytics sound easy: make your data accessible to our tools, push a button, and wondrous visual displays uncover never-before-seen insights into your business. One fact, however, is always downplayed: you actually have to prepare data for analysis. Great visualization tools are worthless if the data is not organized and prepped beforehand. For many data analysis projects and initiatives, the data prep itself can be the most time-consuming and repetitive step.
 
Here are, from our point of view, the top 3 challenges of the data prep process, and how to overcome them.
 

Frustration #1: Merging Data from Different Sources

 
Analysts want to jump right into the data analytics and uncover the promised insights, but first they have to follow the processes for loading the data and making it available to the analytics engine. That’s easily done if all of the necessary data is in a single data set, but it rarely is.
 
Data exists in many different systems, from finance to engineering to CRM, and in both public and private sources. The number one challenge of data prep is the munging and merging that must take place as you combine data from different systems. And it’s never easy. Simple nuances in the data are often the toughest part.
 
The data, data structures, and even the definition of what the data reflects varies from one system to another, and the primary challenge of the data transformation is to merge it together in a consistent manner.
 
For example, one file may contain date and time in a single timestamp column while another splits them into separate columns that must be combined first. Something as simple as how phone numbers and ZIP codes are formatted can wreak havoc on your results. This is the unspoken reality for the data analyst or scientist: the data is often not your friend.
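A minimal pandas sketch of exactly these mismatches, with illustrative file and column names:

```python
import pandas as pd

a = pd.read_csv("system_a.csv")  # has a single "timestamp" column
b = pd.read_csv("system_b.csv")  # has separate "date" and "time" columns

# Normalize both sources to one datetime column before merging.
a["timestamp"] = pd.to_datetime(a["timestamp"])
b["timestamp"] = pd.to_datetime(b["date"] + " " + b["time"])

# ZIP codes: keep them as strings so leading zeros survive (e.g. "02134").
for df in (a, b):
    df["zip"] = df["zip"].astype(str).str.zfill(5)

combined = pd.concat([a, b], ignore_index=True)
```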
 
At Inzata, we watched customers struggling with this challenge, and we have a better way. We noticed that a lot of the work was repetitive and often involved operations that are simple for a human. So we developed Artificial Intelligence that can perform these tedious tasks, requiring only occasional guidance from a person. We call it AI-Assisted Data Modeling, and it takes you from raw, disorganized data to beautiful analytics and visualizations in under 30 minutes. Data analytics is no longer a strenuous task with Inzata’s full-service platform.
 
 

Frustration #2: Lack of Common Ground between the Analyst and IT

 
The analyst is a subject matter expert in her field, and the IT pro knows the systems. But quite often, they don’t know much about each other’s role and can’t speak the same language about the requirements to prepare data for analysis. The analyst requests data from the IT pro, and files get sent back and forth via email and dropboxes.
 
In many cases, the data munging process becomes one of trial and error (request a file, work on it, discover changes, request a new file) until finally, after many iterations, Microsoft Power BI, QlikView, Tableau, or whatever other analytics tool is used delivers the right content.
 
But what if you could work with data in its native source format, coming directly from your source systems, with no ETL middleware or IT guy to call? Inzata lets you organize your connections to source systems (both inside your company and in the cloud) and reads in data in its native physical form. This makes the first steps of data analysis a breeze.
 
Inzata then helps you rapidly create business data models mapped to this native structure. Your days of transforming raw data and generating new files for analysis are behind you. We’ve taken these tedious data analytics tasks and done them for you.
 
Everything else you do in Inzata is driven by these virtual data models, so your users and analysts only see highly organized data, structured and merged in a way they can understand, because it resembles your actual business. Updates are no problem: when new data is ready from source systems, it automatically flows into the Inzata dataset and your reports update in real time.
 
Field names are in English, not computer-ese, and oriented around the things you care about. Customers. Transactions. Employees. These are the “things” you interact with in Inzata, just like in the real world. Data is displayed visually, no code to write. Data aggregations, rollups and filters become their own reusable objects. Creating a new dashboard is as simple as dragging those elements to a blank canvas and giving it a name.
 
What if something changes in the source system, such as field names changing or new fields being added? In the past, this would wreck everything: reports would stop working and you’d have to start over from scratch. Inzata anticipates these slowly changing dimensions and detects when they happen. It automatically reallocates data elements to accommodate changes, and everything keeps working. The end result: you don’t need to worry about changes in source systems wrecking your reports and dashboards.
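Inzata’s detection mechanism is its own, but the core idea can be sketched in a few lines: compare the columns that arrive today against the columns the model expects, and flag the difference instead of crashing.

```python
EXPECTED = {"customer_id", "zip", "order_total", "timestamp"}

def detect_drift(incoming_columns):
    """Report columns that were added to or dropped from the source schema."""
    incoming = set(incoming_columns)
    added, missing = incoming - EXPECTED, EXPECTED - incoming
    if added or missing:
        print("schema drift detected; added:", sorted(added), "missing:", sorted(missing))
    return added, missing

# A renamed field shows up as one added plus one missing column, so the
# mapping ("zip_code" -> "zip") can be repaired instead of breaking reports.
detect_drift(["customer_id", "zip_code", "order_total", "timestamp"])
```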
 

Frustration #3: Missing Audit Trail

 
This part is very important for anyone who uses (or is thinking about using) a data prep tool, Excel, or anything similar that outputs files.
 
Insights gained through data analytics can give decision-makers reason to make significant changes to the business. A lot is riding on these decisions, and there has to be confidence in the accuracy of the data and insights. But after the data is merged from several sources, goes through various transformations, and gets reloaded, it becomes hard to track backwards from the insight to the original data. Honestly, are you going to remember the exact transform steps you performed on a dataset you haven’t touched in three months? The lack of an audit trail weakens the confidence the team can have in the outputs.
 
As a data scientist, you’ve worked hard to develop your reputation and credibility. Shouldn’t your tools also be up to the challenge?
 
By their very nature, file-based Data Prep tools cannot deliver this kind of confidence and auditing, because they are only in possession of the data for a short time. There’s nothing to link the final file with the original data, or the process of data analysis it underwent in the tool. They don’t maintain chain-of-custody to protect the data.
 
Inzata does.
 
From the moment your data enters Inzata’s ecosystem, every activity is meta-tagged. We track everything that happens to your data: who touches it, who accesses it, and what transformations or enrichments it goes through. We also have intelligent temporal shifting, which is a fancy way of saying we invented time travel. (At least for your data, that is.)
 
Here’s how: Inzata stores each incremental state of your data. If you want to see exactly how a report looked 3 months ago, we can show it to you. If you need to re-create a list-query exactly as it would have looked 5 days ago, we can do that.
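The mechanics behind that kind of time travel can be sketched simply: keep each incremental state keyed by timestamp, then replay the latest state at or before the moment you ask about. This is only an illustration of the concept, not Inzata’s actual storage design.

```python
from datetime import datetime

snapshots = {}  # timestamp -> saved dataset state

def save_snapshot(data, when=None):
    """Record an incremental state of the dataset."""
    snapshots[when or datetime.now()] = dict(data)

def as_of(when):
    """Return the dataset exactly as it looked at a past moment."""
    eligible = [t for t in snapshots if t <= when]
    return snapshots[max(eligible)] if eligible else None

save_snapshot({"rows": 100, "revenue": 5_000}, datetime(2018, 6, 1))
save_snapshot({"rows": 250, "revenue": 9_400}, datetime(2018, 9, 1))
print(as_of(datetime(2018, 7, 15)))  # -> the state saved on June 1
```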
 
 
Conclusion
 
Data preparation, and the challenges it entails, is the dirty little secret of big data analytics. The potential insights into your business are valuable, but the process can be so frustrating at times that projects die on the vine. It’s time to spend as much energy evaluating data transformation tools that can take the human out of the equation as you do evaluating data analytics tools.

The Chief Data Monetization Officer: Turn Big Data into Profit

Humans produce around 2.5 quintillion bytes of data daily. However, over 90% of data collected is never read or analyzed. Data monetization is the process of putting your data to work, resulting in economic benefit.

In many businesses, the amount of data that goes unanalyzed is much higher, approaching 100%. We’re spending millions to collect and store this resource, but we’re only putting a tenth of it to practical use. That’s like finding a massive oil deposit underground, and just pumping the crude up to the surface and storing it in huge tanks.

So the problem is not that there isn’t enough data. We have plenty of data, and we’re exceedingly good at collecting and making more.

The problem is one of refinement and distribution. Monetizing oil requires refineries, trucks and gasoline stations to get it to market. Without those, the oil is worthless.

Big Data is not of much value unless it’s driving profit and positive change in the enterprise. Once you’ve figured out how to do that, its value skyrockets.

The one big difference between data and oil is that you can only refine oil into a product once; then it’s gone. Data stays around. You can keep monetizing the same data over and over by refining it, analyzing it, and combining it to produce valuable new assets.

The right insights at the right time can be priceless. They can save lives, avert disasters, and help us achieve incredible outcomes.

Great data projects start with great questions. Not “interesting” or “nice to have” questions, but truly great questions that, when answered, will visibly move the needle on the business.

Unfortunately, most business leaders aren’t used to walking around the office asking impossible questions that seemingly no one can answer. But that’s exactly what I encourage them to do.

The most valuable person at the start of any Big Data project is the person who understands what’s possible with Data Monetization. It takes vision, and their confidence gives others the courage to ask the hard questions.

It’s not enough to just collect and work with data. The questions don’t come from the data, the answers do. It’s your job to come up with the best questions.

Organizations across all industries have large volumes of data that could be used to answer consumer and business questions or drive data monetization strategies.

This requires a skill many organizations have yet to develop. To get the maximum economic value from data monetization, organizations should shift their emphasis from Chief Data Officers, or CDOs, to Data Monetization Officers.

Low-cost BI analytical platforms are revolutionizing the way the world makes decisions. A bold claim? Not really. To help examine the impact that widely used BI platforms paired with Big Data will have, let us describe how the information-sharing and data monetization process works.

Chief Data Officers typically come from an IT background and report to the CIO. A DMO comes from a business background and understands how the business functions the way a COO or CFO would. They’re tasked with using data to provide direct, measurable benefits to the business. Their job is to monetize the company’s information assets. They have an inclination toward revenue growth and are skilled in finding new data monetization revenue opportunities and customers.

The DMO has a strong affinity for measurement. This shouldn’t be much of a stretch for someone with “data” in their job title, but they need to be willing to apply it to themselves as often as necessary. They need to be picky in choosing the truly “great ideas” for data monetization. They need to resist the ones that won’t improve business performance, no matter how neat they sound.

Smart organizations understand the benefits of having someone focused on extracting business value from data and charting a data monetization strategy.

By 2022, most companies will have a specialized resource, or DMO, in charge of managing and monetizing the company’s most valuable asset: its data.

If you’re reading this thinking, “We don’t have enough data to justify this kind of role,” think again. Most companies already have more than enough data to make an initiative like this worthwhile.

I’d love to know what you think.

Would your company benefit from someone in charge of managing the ROI of data?

Could data monetization change the way you look at your data, and possibly create opportunities for profit?

How effective is your organization at leveraging data and analytics to power your business?

Are you a candidate for this type of role?

  • Do you understand your organization’s key business initiatives and what data reflects how they are doing? Do you understand and track leading indicators of success?
  • Can you estimate the economic value of your data both inside and outside of your company?
  • Do you have the skills and tools to exploit this economic value?

Learn more about Inzata, the first Analytics platform designed for Data Monetization.
