The Data Century

The 20th century introduced the “Atomic Age.” Will the 21st be the “Data Century”?


The 20th century arguably began, conceptually, when Albert Einstein published his seminal paper on special relativity. The intellectual upheaval this initiated ultimately led to the expression E = mc², nuclear energy, and the atomic bomb. Atomic weapons and nuclear power revolutionized war, diplomacy, geopolitics, and, to a lesser extent, electricity generation. Even now, a nuclear reaction is a pretty inefficient way to boil water.

The 21st century began with the dot-com boom and bust, and the proliferation of data networks, data mining, data analysis, and data storage. “Big Data” is the term we use for data sets too large and complex for traditional data-management approaches. There are now more than 1 trillion websites containing more than 10 trillion individual web pages and more than 500 exabytes of data. (An exabyte is 1 million terabytes.) And the data keep growing exponentially.

Figure: Hilbert, M., Science 332(6025): 60–65. Source: Wikipedia
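For a sense of scale, the unit conversion is simple arithmetic. Here is a trivial Python sketch; the 500-exabyte figure is just the estimate cited above:

```python
# Back-of-the-envelope scale check for the figures above.
EXABYTE_IN_TERABYTES = 10**6   # 1 exabyte = 1 million terabytes
web_data_eb = 500              # the estimate cited in the text

print(f"{web_data_eb} EB = {web_data_eb * EXABYTE_IN_TERABYTES:,} TB")
print(f"{web_data_eb} EB = {web_data_eb * 10**18:,} bytes")
```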

The complexity can boggle our minds, in the same way that the implications of relativity did a hundred years ago: time doesn’t proceed at a uniform pace across the universe, light bends around massive objects, and the universe’s usable energy is gradually running down. But while the atomic age gave us immensely powerful weapons, it was still subject to rather prosaic rules of governance: the military must be subordinate to civilian authority, war is an extension of diplomacy, and so on. Nuclear power didn’t fundamentally change human nature.

Likewise, the rules of big data still have to follow the rules of “little data”: correlation does not equal causation, bad data lead to bad conclusions, and there’s always more data to collect, among others. Just because we have more information doesn’t mean our thinking is any better. Indeed, one of the most important axioms of data modeling is that accuracy is more important than precision.
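The first of those rules is worth a quick demonstration. In the following Python sketch (the seed and series are purely illustrative), two random walks are generated independently, so by construction neither causes the other; yet trending series like these routinely show sizable correlations purely by chance:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Two independent random walks: by construction, neither causes the other.
x = np.cumsum(rng.normal(size=1_000))
y = np.cumsum(rng.normal(size=1_000))

# Trending series frequently appear correlated even when unrelated.
print(f"correlation: {np.corrcoef(x, y)[0, 1]:+.2f}")
```

More data makes it easier, not harder, to stumble onto patterns like this that mean nothing.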

Still, there’s a lot of new material out there. We can now use satellite images to gauge industrial activity in remote locations, screen-scrape a billion prices a day to estimate inflation around the world, and track lending and borrowing activity in real time. This has the potential to make our economic analysis and investment decisions far more efficient.
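As a concrete illustration of the price-scraping idea, here is a minimal Python sketch of how scraped prices could feed an inflation estimate. The items, prices, and the use of a Jevons-style (geometric-mean) index are illustrative assumptions, not a description of any particular project’s method:

```python
from statistics import geometric_mean

# Hypothetical scraped prices for the same items one month apart:
# item -> (price_last_month, price_this_month). All names and numbers
# are made up for illustration.
prices = {
    "milk":   (3.49, 3.59),
    "bread":  (2.99, 3.05),
    "coffee": (8.99, 9.49),
}

# Jevons-style index: the geometric mean of the price relatives.
relatives = [new / old for old, new in prices.values()]
index = geometric_mean(relatives)
print(f"one-month inflation estimate: {index - 1:.2%}")
```

A real project would weight items and handle substitutions; the geometric mean simply keeps the toy example from being skewed by any one price.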

Just don’t expect a revolution. “Big Data” may allow us to be more informed, but it won’t make us any smarter.

Douglas Tengdin, CFA

Charter Trust Company

March 31st, 2016 | Global Market Update

About the Author:

Mr. Tengdin is the Chief Investment Officer at Charter Trust Company and author of “The Global Market Update”. The audio version of each post can be heard on radio stations throughout New England every weekday. Mr. Tengdin graduated from Dartmouth College, magna cum laude. He received his Master of Arts from Trinity Divinity School, magna cum laude, and received his Chartered Financial Analyst (CFA) designation in 1992. Mr. Tengdin has been managing investment portfolios for over 30 years, working for Bank of Boston, State Street Global Advisors, Citibank – Tunisia, and Banknorth Group. Throughout his career, Mr. Tengdin has emphasized helping clients manage their financial risks in difficult environments where they can profit from investing in diverse assets in diverse settings. Leave a comment if you have any questions; I read them all! And follow me on Twitter @GlobalMarketUpd
