Ethics in AI
July 11th, 2019
Big data is a popular term used to describe the exponential growth, availability and use of information in today's data-rich landscape. The term "big data" puts an inordinate focus on the issue of information volume (in every aspect, from storage through transformation and transport to analysis). Big data is also heavily weighted toward current issues and can lead to short-sighted decisions that will hamper the enterprise's information architecture as IT leaders try to expand and change it to meet evolving business needs.
Information managers may be tempted to focus on volume alone while they are simultaneously losing control of the access and qualification aspects of data. Gartner analysts warn that too narrow a focus will force massive reinvestment in two to three years to address the other dimensions of big data.
“Today’s information management disciplines and technologies are simply not up to the task of handling all these dynamics. Information managers must fundamentally rethink their approach to data by planning for all the dimensions of information management,” said Mark Beyer, research vice president at Gartner. “The business’s demand for access to the vast resources of big data gives information managers an opportunity to alter the way the enterprise uses information. IT leaders must educate their business counterparts on the challenges while ensuring some degree of control and coordination so that the big-data opportunity doesn’t become big-data chaos, which may raise compliance risks, increase costs and create yet more silos.”
Worldwide information volume is growing at a minimum rate of 59 percent annually, and while volume is a significant challenge in managing big data, business and IT leaders must focus on information volume, variety and velocity.
Volume: The increase in data volumes within enterprise systems is caused by transaction volumes and other traditional data types, as well as by new types of data. Too much volume is a storage issue, but too much data is also a massive analysis issue.
Variety: IT leaders have always had an issue translating large volumes of transactional information into decisions — now there are more types of information to analyze — mainly coming from social media and mobile (context-aware). Variety includes tabular data (databases), hierarchical data, documents, e-mail, metering data, video, still images, audio, stock ticker data, financial transactions and more.
Velocity: This involves streams of data, structured record creation, and availability for access and delivery. Velocity means both how fast data is being produced and how fast the data must be processed to meet demand.
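The velocity dimension above can be made concrete with a small sketch (a hypothetical illustration, not part of the Gartner material): if data is produced faster than it can be processed, the unprocessed backlog grows without bound, which is why processing capacity must keep pace with production rate.

```python
# Hypothetical illustration of the "velocity" dimension:
# if events arrive faster than they are consumed, a backlog accumulates.

def backlog_after(seconds, arrival_rate, processing_rate):
    """Return the queued (unprocessed) event count after `seconds`,
    given events-per-second produced and events-per-second processed."""
    backlog = 0
    for _ in range(seconds):
        backlog += arrival_rate                    # events produced this second
        backlog -= min(backlog, processing_rate)   # events consumed this second
    return backlog

# Producing 1,000 events/sec while processing only 800/sec
# leaves a backlog that grows by 200 events every second.
print(backlog_after(10, 1000, 800))  # 2000

# When processing keeps up with production, no backlog forms.
print(backlog_after(10, 500, 800))   # 0
```

The numbers here are arbitrary; the point is simply that velocity is a balance between production rate and processing rate, not a property of volume alone.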
While big data is a significant issue, Gartner analysts said the real issue is making sense of big data and finding patterns in it that help organizations make better business decisions. “The ability to manage extreme data will be a core competency of enterprises that are increasingly using new forms of information — such as text, social and context — to look for patterns that support business decisions in what we call Pattern-Based Strategy,” said Yvonne Genovese, vice president and distinguished analyst at Gartner. “Pattern-Based Strategy, as an engine of change, utilizes all the dimensions in its pattern-seeking process. It then provides the basis of the modeling for new business solutions, which allows the business to adapt. The seek-model-and-adapt cycle can then be completed in various mediums, such as social computing analysis or context-aware computing engines.”
Additional analysis is available in the Gartner Special Report “Pattern-Based Strategy: Getting Value from Big Data”. The report provides links to numerous reports that examine key issues related to managing big data. The Special Report includes video commentary from Ms. Genovese, as well as a Talking Technology interview that reviews the term big data, and why IT leaders should act now. The Special Report is available at www.gartner.com/patternbasedstrategy.
Source: Gartner Newsroom