
The human insights missing from big data | Tricia Wang

In ancient Greece, when anyone from slaves to soldiers,
poets and politicians, needed to make a big decision
on life’s most important questions, like, “Should I get married?” or “Should we embark on this voyage?” or “Should our army
advance into this territory?” they all consulted the oracle. So this is how it worked: you would bring her a question
and you would get on your knees, and then she would go into this trance. It would take a couple of days, and then eventually
she would come out of it, giving you her predictions as your answer. From the oracle bones of ancient China to ancient Greece to Mayan calendars, people have craved prophecy in order to find out
what’s going to happen next. And that’s because we all want
to make the right decision. We don’t want to miss something. The future is scary, so it’s much nicer
knowing that we can make a decision with some assurance of the outcome. Well, we have a new oracle, and its name is big data, or we call it “Watson”
or “deep learning” or “neural net.” And these are the kinds of questions
we ask of our oracle now, like, “What’s the most efficient way
to ship these phones from China to Sweden?” Or, “What are the odds of my child being born
with a genetic disorder?” Or, “What sales volume
can we predict for this product?” I have a dog. Her name is Elle,
and she hates the rain. And I have tried everything
to untrain her. But because I have failed at this, I also have to consult
an oracle, called Dark Sky, every time before we go on a walk, for very accurate weather predictions
in the next 10 minutes. She’s so sweet. So because of all of this,
our oracle is a $122 billion industry. Now, despite the size of this industry, the returns are surprisingly low. Investing in big data is easy, but using it is hard. Over 73 percent of big data projects
aren’t even profitable, and I have executives
coming up to me saying, “We’re experiencing the same thing. We invested in some big data system, and our employees aren’t making
better decisions. And they’re certainly not coming up
with more breakthrough ideas.” So this is all really interesting to me, because I’m a technology ethnographer. I study and I advise companies on the patterns
of how people use technology, and one of my interest areas is data. So why is having more data
not helping us make better decisions, especially for companies
who have all these resources to invest in these big data systems? Why isn’t it getting any easier for them? So, I’ve witnessed the struggle firsthand. In 2009, I started
a research position with Nokia. And at the time, Nokia was one of the largest
cell phone companies in the world, dominating emerging markets
like China, Mexico and India — all places where I had done
a lot of research on how low-income people use technology. And I spent a lot of extra time in China getting to know the informal economy. So I did things like working
as a street vendor selling dumplings to construction workers. Or I did fieldwork, spending nights and days
in internet cafés, hanging out with Chinese youth,
so I could understand how they were using
games and mobile phones and using it between moving
from the rural areas to the cities. Through all of this qualitative evidence
that I was gathering, I was starting to see so clearly that a big change was about to happen
among low-income Chinese people. Even though they were surrounded
by advertisements for luxury products like fancy toilets —
who wouldn’t want one? — and apartments and cars, through my conversations with them, I found out that the ads
that actually enticed them the most were the ones for iPhones, promising them this entry
into this high-tech life. And even when I was living with them
in urban slums like this one, I saw people investing
over half of their monthly income into buying a phone, and increasingly, they were “shanzhai,” which are affordable knock-offs
of iPhones and other brands. They’re very usable. Does the job. And after years of living
with migrants and working with them and just really doing everything
that they were doing, I started piecing
all these data points together — from the things that seem random,
like me selling dumplings, to the things that were more obvious, like tracking how much they were spending
on their cell phone bills. And I was able to create
this much more holistic picture of what was happening. And that’s when I started to realize that even the poorest in China
would want a smartphone, and that they would do almost anything
to get their hands on one. You have to keep in mind, iPhones had just come out, it was 2009, so this was, like, eight years ago, and Androids had just started
looking like iPhones. And a lot of very smart
and realistic people said, “Those smartphones — that’s just a fad. Who wants to carry around
these heavy things where batteries drain quickly
and they break every time you drop them?” But I had a lot of data, and I was very confident
about my insights, so I was very excited
to share them with Nokia. But Nokia was not convinced, because it wasn’t big data. They said, “We have
millions of data points, and we don’t see any indicators
of anyone wanting to buy a smartphone, and your data set of 100,
as diverse as it is, is too weak for us to even take seriously.” And I said, “Nokia, you’re right. Of course you wouldn’t see this, because you’re sending out surveys
assuming that people don’t know what a smartphone is, so of course you’re not going
to get any data back about people wanting to buy
a smartphone in two years. Your surveys, your methods
have been designed to optimize an existing business model, and I’m looking
at these emergent human dynamics that haven’t happened yet. We’re looking outside of market dynamics so that we can get ahead of it.” Well, you know what happened to Nokia? Their business fell off a cliff. This — this is the cost
of missing something. It was unfathomable. But Nokia’s not alone. I see organizations
throwing out data all the time because it didn’t come from a quant model or it doesn’t fit in one. But it’s not big data’s fault. It’s the way we use big data;
it’s our responsibility. Big data’s reputation for success comes from quantifying
very specific environments, like electricity power grids
or delivery logistics or genetic code, when we’re quantifying in systems
that are more or less contained. But not all systems
are as neatly contained. When you’re quantifying
and systems are more dynamic, especially systems
that involve human beings, forces are complex and unpredictable, and these are things
that we don’t know how to model so well. Once you predict something
about human behavior, new factors emerge, because conditions
are constantly changing. That’s why it’s a never-ending cycle. You think you know something, and then something unknown
enters the picture. And that’s why just relying
on big data alone increases the chance
that we’ll miss something, while giving us this illusion
that we already know everything. And what makes it really hard
to see this paradox and even wrap our brains around it is that we have this thing
that I call the quantification bias, which is the unconscious belief
of valuing the measurable over the immeasurable. And we often experience this at our work. Maybe we work alongside
colleagues who are like this, or even our whole entire
company may be like this, where people become
so fixated on that number, that they can’t see anything
outside of it, even when you present them evidence
right in front of their face. And this is a very appealing message, because there’s nothing
wrong with quantifying; it’s actually very satisfying. I get a great sense of comfort
from looking at an Excel spreadsheet, even very simple ones. (Laughter) It’s just kind of like, “Yes! The formula worked. It’s all OK.
Everything is under control.” But the problem is that quantifying is addictive. And when we forget that and when we don’t have something
to kind of keep that in check, it’s very easy to just throw out data because it can’t be expressed
as a numerical value. It’s very easy just to slip
into silver-bullet thinking, as if some simple solution existed. Because this is a great moment of danger
for any organization, because oftentimes,
the future we need to predict — it isn’t in that haystack, but it’s that tornado
that’s bearing down on us outside of the barn. There is no greater risk than being blind to the unknown. It can cause you to make
the wrong decisions. It can cause you to miss something big. But we don’t have to go down this path. It turns out that the oracle
of ancient Greece holds the secret key
that shows us the path forward. Now, recent geological research has shown that the Temple of Apollo,
where the most famous oracle sat, was actually built
over two earthquake faults. And these faults would release
these petrochemical fumes from underneath the Earth’s crust, and the oracle literally sat
right above these faults, inhaling enormous amounts
of ethylene gas from these fissures. (Laughter) It’s true. (Laughter) It’s all true, and that’s what made her
babble and hallucinate and go into this trance-like state. She was high as a kite! (Laughter) So how did anyone — How did anyone get
any useful advice out of her in this state? Well, you see those people
surrounding the oracle? You see those people holding her up, because she’s, like, a little woozy? And you see that guy
on your left-hand side holding the orange notebook? Well, those were the temple guides, and they worked hand in hand
with the oracle. When inquisitors would come
and get on their knees, that’s when the temple guides
would get to work, because after they asked her questions, they would observe their emotional state, and then they would ask them
follow-up questions, like, “Why do you want to know
this prophecy? Who are you? What are you going to do
with this information?” And then the temple guides would take
this more ethnographic, this more qualitative information, and interpret the oracle’s babblings. So the oracle didn’t stand alone, and neither should our big data systems. Now to be clear, I’m not saying that big data systems
are huffing ethylene gas, or that they’re even giving
invalid predictions. The total opposite. But what I am saying is that in the same way
that the oracle needed her temple guides, our big data systems need them, too. They need people like ethnographers
and user researchers who can gather what I call thick data. This is precious data from humans, like stories, emotions and interactions
that cannot be quantified. It’s the kind of data
that I collected for Nokia that comes in in the form
of a very small sample size, but delivers incredible depth of meaning. And what makes it so thick and meaty is the experience of understanding
the human narrative. And that’s what helps to see
what’s missing in our models. Thick data grounds our business questions
in human questions, and that’s why integrating
big and thick data forms a more complete picture. Big data is able to offer
insights at scale and leverage the best
of machine intelligence, whereas thick data can help us
rescue the context loss that comes from making big data usable, and leverage the best
of human intelligence. And when you actually integrate the two,
that’s when things get really fun, because then you’re no longer
just working with data you’ve already collected. You get to also work with data
that hasn’t been collected. You get to ask questions about why: Why is this happening? Now, when Netflix did this, they unlocked a whole new way
to transform their business. Netflix is known for their really great
recommendation algorithm, and they had this $1 million prize
for anyone who could improve it. And there were winners. But Netflix discovered
the improvements were only incremental. So to really find out what was going on, they hired an ethnographer,
Grant McCracken, to gather thick data insights. And what he discovered was something
that they hadn’t seen initially in the quantitative data. He discovered that people loved
to binge-watch. In fact, people didn’t even
feel guilty about it. They enjoyed it. (Laughter) So Netflix was like,
“Oh. This is a new insight.” So they went to their data science team, and they were able to scale
this big data insight in with their quantitative data. And once they verified it
and validated it, Netflix decided to do something
very simple but impactful. They said, instead of offering
the same show from different genres or more of the different shows
from similar users, we’ll just offer more of the same show. We’ll make it easier
for you to binge-watch. And they didn’t stop there. They did all these things to redesign their entire
viewer experience, to really encourage binge-watching. It’s why people and friends disappear
for whole weekends at a time, catching up on shows
like “Master of None.” By integrating big data and thick data,
they not only improved their business, but they transformed how we consume media. And now their stocks are projected
to double in the next few years. But this isn’t just about
watching more videos or selling more smartphones. For some, integrating thick data
insights into the algorithm could mean life or death, especially for the marginalized. All around the country,
police departments are using big data for predictive policing, to set bond amounts
and sentencing recommendations in ways that reinforce existing biases. NSA’s Skynet machine learning algorithm has possibly aided in the deaths
of thousands of civilians in Pakistan from misreading cellular device metadata. As all of our lives become more automated, from automobiles to health insurance
or to employment, it is likely that all of us will be impacted
by the quantification bias. Now, the good news
is that we’ve come a long way from huffing ethylene gas
to make predictions. We have better tools,
so let’s just use them better. Let’s integrate the big data
with the thick data. Let’s bring our temple guides
with the oracles, and whether this work happens
in companies or nonprofits or government or even in the software, all of it matters, because that means
we’re collectively committed to making better data, better algorithms, better outputs and better decisions. This is how we’ll avoid
missing that something. (Applause)

Comments (68)

  1. Hey first like
    First comment
    9th viewer

  2. Well yes, statistics don't show individual stories and therefore don't always reveal the entire story.

  3. I've watched this from Ted somewhere

  4. >Nokia didn't follow my orders and because of that they nearly went bankrupt

  5. legalize weed 😛

  6. Stupid Nokia 🤦🏻‍♂️

  7. Nokia must be kicking themselves.

  8. It's amazing how history repeats itself. What big data needs is more Philosophy and Scientific Theory. More inductive, rather than deductive models. She was trying to say that, but I guess she hasn't really sat down to think about the problem, which is ironically the same problem she's alluding to.

  9. Every time she says 'expecially', a kitten dies.

  10. I loved the talk and the presenter. She exudes personality, but the shoes do nothing for me.

  11. Do you know who's beautiful? Read the second word in the first sentence.

  12. Does anyone believe that Netflix's recommendation system is good? I hate it and I get my recommendations outside their environment.

  13. Big Data is the Cake Mix Aisle at your Local Supermarket.

    Consumers want Pie, because Cake is a Lie.

    This is the paradox.

    Did you really need a TED talk to say that?

  14. Yeah, it's a female speaker but listen to what she has to say. She has fascinating points that make you think about how data collection can cause people to make wrong decisions without using social justice to reinforce her points

  15. tl;dr
    T H I C C D A T A
    imma pass this to my unsupervised algorithm that i run on a distributed super computer on the based framework of Excel sheets. So much insight.

  16. Do you know what is Beautiful?

    Take the number of letters from the first word, then divide by 0.5, then take that number and count the words. Where do you end?

  17. Those are some ugly fucking shoes lady. My human insight says to throw those fuckers in the trash.

  18. Videos like this are exactly why ted is so brilliant and so important. Really great insights and so well presented. Keep doing what youre doing!

  19. I would have liked her to use a little time explaining why quantity in data is good as well, instead of just bashing it. If it really is this bad, and not just her collection of anecdotes, she should have made her case better. Quantity in data is used >>so much<< today that finding cases where it has been misused or treated as the sole truth is bound to happen often. The interesting discussion is to what extent "thick data", as she buzzwordedly calls qualitative data, works better than quantitative data. This discussion is nothing new. What is new is taking a step back from the "big data hype", but I don't think we need a "thick data hype" instead.

  20. Nicely done, Ms. Wang! Thank you for sharing.

  21. Is "thick data" anything else than qualitative data?

  22. thanks for sharing ^^

  23. I was expecting to really like this talk. Instead, I was disappointed at the lack of substance. She only gave one example of the problem, involving a company that clearly rejected her findings, and offered to hire more of her profession as a solution. The worst part is that I agree with her but she didn't even try to make a case for her point. Weak.

  24. Using small subjective data (cherry-picking within success cases; survivorship bias) to prove that you need a new type of individual (exactly the ones like her) to interpret how to do big data the right way (without bias)…

    In the end, humans are way too subjective and lazy to be right most of the time. We look into numbers and data mostly for confirmation. We don't want to hear "it depends" all the time. We want nice variables which can be easily transformed, filtered and presented to prove that we are right. You can't solve this problem by relying on a single individual (as proposed in the video). More logical (but still not best) is to have a large, diverse team, where each member brings a unique subjective perspective.

  25. Big data without people who know how to read it is like the collected works of Shakespeare in a frat house.

  26. Thought her voice would be better. It's just a boring voice. Like that of a librarian in the midst of contemplating suicide. A real disappointment to be sure.

  27. Sounds like her thick data theory & "Everybody Lies" would dovetail nicely to help assuage the perplexities of the social & marketing engineers. It's just the thing needed for our deontological ethics & our special interest society.

  28. This phase is inevitable. The future is kinda scary: you are sucked into this system, and it will kill you if you don't comply. Everything happening around you has indirect monetary profit, including how a person thinks and behaves.

  29. 6:45 Android is Linux. And why did she use three grays?

  30. Not a good orator... it was indeed very tough to comprehend what she wanted to convey.

  31. True intent of the video.
    Future Data Analyst: Hey look! Humans long ago listening to a woman claiming she predicted the future and was able to predict human actions!
    Future audience: Ha, ha, ha, ha. These people are stupid.
    The cycle forever continues. There is no future, there is no end. It is all a constant.

  32. "what is missing from our models"
    path for the AI

  33. I don't think you have this right… what you should be saying is that the culture within Nokia did not want change and refused to accept data from the market which did not support their current norms… we agree Nokia was never going to succeed once Microsoft took over.

  34. A perfect example is Google/YouTube. Is using filters that give people things they've already seen a good idea? Now we have the stupidest net, where it's hard to find anything anymore. Then, in the last few days, YouTube changed the background colour to pure white, so now you can't see which tab you are currently on. Stupid programmers/system analysts.

  35. TED! You've motivated me to start YouTube and I'm only at 2,997 but I'm loving making videos a ton. Thank you!!

  36. what is she selling?

  37. There is a simple and naive way to predict the future: create it now.

  38. OMG such a genius! (sarcastic) This is the kind of self-inflating-ego TEDx talk that is painful to watch.

  39. I'd never heard of the term "thick data", but through my master's studies I have learned a few tools that don't rely only on statistical methods but also integrate them with qualitative data. I must say it is an interesting talk.

  40. It's a shame Chinese people wanted to get smartphones so much, because they reduce your IQ and concentration skills. https://www.laboratoryequipment.com/news/2015/03/smartphone-use-linked-intelligence http://health.usnews.com/health-news/articles/2015/03/06/could-smartphones-lower-intelligence

  41. Binge-watching, in addition to smartphones, is not necessarily beneficial to humanity; it creates a lazier, more distracted society, engrossed in their screens instead of talking to each other.

    Sure, these things improve the numerical gains of the companies, but at the expense of having a negative effect on society.

    So, in the end, you need to look at the thick goals as well. Not just having numerically big goals of 'we'll earn this much', but actually having value- and result-driven, ethical practices that produce a better society.

  42. Translate into Spanish

  43. HARD data, BIG data… and now, THICK data

  44. why weren't the temple guides getting stoned?

  45. Great talk! I've been saying this for years. Science has a hard time with nuanced data, and scientists tend to assume it either doesn't exist or is not important, when it can be highly important and affect everything.

  46. lol Watson is a super computer not "big data" itself…

  47. Her studies are needed, but not for now; machines should be based on precise quantitative data.

  48. This could've been an amazing talk if the presenter would've spent more time on the nitty gritty as opposed to spending 15 minutes telling a defunct company I told you so. 8/10

  49. I'm studying anthropology in a country with almost nonexistent funds for social science studies and with a very narrow view of how anthropology must be applied, but watching this video has opened my eyes. My career is not useless or limited, as many people have told me all these years; it has many possibilities outside what my university aims at, and I can do bigger things. Thank you, TED!

  50. You say nothing about China's good, advanced side; you only show the ugly side. SHAME ON YOU

  51. Any links to the sources of these claims? 73% not profitable etc. I'm doing a school project about big data.

  52. She needs to stop giggling so damn much

  53. Just popping in: my English teacher asked us to practice speaking, and I chose this talk. I think it gave me new ideas and was quite inspiring. But the comments above seem rather unhappy with her delivery (not her views)…

  54. English subs are missing

  55. I want one of those phones

  56. A single leaf before the eye blocks out Mount Tai (a small thing obscuring the big picture)

  57. Companies don't want to miss opportunities. They hire data scientists and start gathering data. And the next thing they realize is that they are sitting on a huge pile of data that cannot be used to deduce anything even close to business value. Their data scientists are so unchallenged that they all wonder why they were hired. The CEOs of these companies should have started by watching this video.

  58. "Quantifying is very addictive…", this is so true.

  59. I have a solution to all of our problems. Stop micromanaging humanity; you're causing great harm. I hear this: "We can't read all the data. How can we efficiently exploit people?" Just how much soul are you willing to exchange for market viability?

  60. Wow so that’s why that kid sold his organ for one!

  61. Simply amazing! Thanks prof. Wang!!
