Blog

Don’t Make Big Decisions without Big Data

George Gosselin

As a business executive or senior manager, no one doubts your experience or your ability to lead on instinct alone… but massive amounts of historical, current, and predictive trend data are also readily available right now to help you. There’s never been a better time to leverage the power of big data analytics to accelerate your revenue and performance goals.

Data has become critical to the daily operation of almost every company in almost every industry.

Data comes in all shapes and sizes, from large cities and small towns: your last mortgage payment, your flight details, your banking and credit card transactions, and the gaming app your kids just downloaded are all part of a growing volume of stored data, accumulating at a rate well beyond our human ability to digest, let alone interpret, in real time.

The vast array of automated data sources available today provides measurable information about all aspects of a company’s business. Opportunities are also available today to help your teams apply new technologies to convert that raw data into answers.

Consider the following supportive points:

  • International Data Corporation (IDC) researchers predict the business analytics and big data market will expand 50 percent by the year 2020 with global annual revenues approaching $190 billion.
  • A study by the MIT Sloan Center for Digital Business recently concluded that enterprises that invest in data-driven infrastructures are over 25 percent more profitable.
  • A 2016 IDC publication on the specific advantages of a Cisco UCS integrated infrastructure reported a one-third drop in operational costs and a 56 percent improvement in the speed of time-to-market decisions.

Emergence of Big Data Architectures

Without a way to normalize, store, scrub, filter, sort, and report on the data, your company has a lot of flour but no bread: mountains of data, but no way to use them. This is where data scientists and specialists in big data analytics, business intelligence, and advanced reporting have made their new home.

Years ago, only the largest enterprises could afford to invest in data warehouses, with their massive disk farms and compute resources. Those systems met the ERP and financial demands of the time, but they were brought to their knees when IT departments, responding to emerging business requirements, began trying to mine the data for business benefit.

The ERP systems of yesterday are still viable but mining data, specifically looking for business advantages, requires a new approach. This is where big data architectures that are capable of storing, searching, reporting, and correlating data on a massive scale come into play.

Several distinct advantages of this architecture include:

  • Distributed Performance and Cost Savings: Rather than relying on a small number of very powerful, costly systems tied to a disk farm, the new architecture divides the data among large numbers of fairly standard, less expensive servers with direct-attached storage. Intense number crunching is easier and faster when we split the workload into separate tasks and execute them in parallel across a large number of machines.
  • Parallel Processing: Essential for saving time and reducing workload processing costs, parallel processing of large data sets can be achieved in on-premises or cloud-based big data solutions.
  • Automation and Orchestration: Options are available for scaling or sharing resources. You can scale your big data cluster to meet the demands of your business without over-purchasing. Some members of the cluster can perform multiple tasks: when they are not needed for big data queries or reports, they can be used for application development, QA testing, or other data processing bursts. This is facilitated through automation and orchestration and is typically transparent to end users.
  • On-Prem, Cloud, or Hybrid Approach: Deciding between on-premises and cloud involves a thorough understanding of traffic flows, data storage and retention, the frequency of data access, the complexity of your extract, transform, and load (ETL) process, data transfer requirements, and other factors. The good news is that you don’t have to decide right away. You can preserve current investments now and gradually transition to an optimized architecture later. For example, you might decide to augment your existing data warehouse today, then over time replace the aging, expensive data warehouse with a more cost-effective big data architecture. This approach allows you to vastly improve your data processing power as your business expands without straining fiscal budgets. You can begin gathering insight into your business and investigating new market opportunities without massive investments in hardware and without rebuilding current systems.
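To make the divide-and-conquer idea above concrete, here is a minimal sketch, in Python using only the standard library, of the map-and-merge pattern that big data frameworks apply at far larger scale: partition the data, let each worker process its partition independently and in parallel, then merge the partial results. The function names are illustrative, not taken from any particular product.

```python
from collections import Counter
from multiprocessing import Pool


def count_words(chunk):
    """Map step: count word frequencies in one partition of the data."""
    return Counter(chunk.split())


def parallel_word_count(documents, workers=2):
    """Divide the documents among worker processes, then merge (reduce)
    the partial counts into a single result."""
    with Pool(workers) as pool:
        partials = pool.map(count_words, documents)
    total = Counter()
    for partial in partials:
        total += partial
    return total


if __name__ == "__main__":
    docs = [
        "big data big insight",
        "data drives decisions",
        "big decisions need big data",
    ]
    print(parallel_word_count(docs).most_common(3))
```

A real cluster does the same thing across many machines instead of local processes, which is why adding fairly standard servers scales the work almost linearly.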

Planning Big Data Projects

  • Infrastructure: Invest in a well-architected physical compute, network, and storage infrastructure.
  • Data Science Professionals: In the United States alone, there are over 35,000 jobs in computer and information research, and double-digit growth is expected in this field.
  • Business Personnel and SMEs: Data scientists understand the data, but they do not know the lines of business as well as your managers and subject matter experts (SMEs), who can work with the data scientists to develop goals and to distinguish errors, fads, and data anomalies from meaningful trends.
  • Leadership Approval: Identify corporate sponsors and sources of funding at your organization. These leaders can create a culture of excellence, transparency, and a sense of mission, and they are crucial to securing approval and buy-in and to promoting the initiative so that it has a chance of succeeding.

When any of these key ingredients are missing, your big data project could miss high-level goals, produce mediocre results, or even fail.

When appropriately planned, your big data project will deliver improvements in performance and financial results. For example, Cisco partners with leading independent software vendors to provide best-in-class data management, data preparation, and analytics, and more than 85 percent of Fortune 500 companies have invested in Cisco UCS infrastructures.

The ability to prepare and correctly use your data is vital for success, and a common concern is determining the right strategy for your big data and analytics services. Data and analytics are solving today’s business problems and delivering tomorrow’s business insights. Companies without solutions to aggregate their data and provide impactful, self-service analytics will fail to achieve operational efficiency.

George Gosselin, Chief Technology Officer, CDI

George Gosselin was named Chief Technology Officer at CDI in May 2010. In this role, he is responsible for spearheading the evolution of the company’s technical skillset and for staying informed about the critical core technologies that will drive CDI’s services and solutions into the future. Prior to his role as CTO, George was director of networking systems, responsible for all aspects of the company’s network consulting business, including routing and switching, wireless, security, IP telephony, videoconferencing, and operational management. His key achievements include developing CDI’s networking practice and driving the company to earn Cisco Gold Certified status. George graduated in 1987 from Saint Lawrence University with a Bachelor of Arts in economics and a concentration in computer science.