Secrets of the $4 billion business model of “Pokemon Go”

The idea of “surveillance capitalism” is mysterious, and perhaps sinister. Google has pioneered it as a business model over the last 20 years. We used to search for bargains online; now advertisers are “searching us”. Where we go in the online world, what we search for and our relationships with others are all combined to create a profile. Online games have been a huge source of this information, and Google helped pioneer these strategies in games like Pokemon Go, while other platforms have applied similar principles. This post looks into this model and the implications of its approach.

An excellent book on this subject is “The Age of Surveillance Capitalism” by Shoshana Zuboff, which raises some thought-provoking questions about the future.

Surveillance Capitalism

E-commerce sites are no longer waiting for customers to come to them. Behavioural analysis of metadata harvested from social media allows customers to be targeted.

Do you remember Foursquare, the social media company that pioneered the gamification of mobile check-ins? Maybe not, but don’t worry, they have not forgotten about you :). In fact, Foursquare’s data is used by Uber, Apple Maps, Airbnb, WeChat, Samsung phones and more: you get the picture! This data, combined with your browsing habits, reveals your personality, your preferences and everything else about you.

This harvesting and profiling, with the unwitting consent of most users, is driving the growth of the behavioural analytics market. Big tech companies like Google, Amazon and Facebook have been developing this technology for years.

Political campaigns have used the same predictive technology to influence and persuade a targeted audience.

Pokemon Go

The 2016 Pokemon Go sensation took the world by storm. Armies of Pokemon enthusiasts swarmed their surroundings to capture Pokemon. This seemingly innocuous but addictive game caught the imagination of the masses, and the mantra to “catch ’em all” was soon echoed by people who had never heard of Pokemon. The gamification element made people competitive, and peer pressure drew them in.

Parents were happy to encourage their children to go for long walks to capture these virtual creatures; it was even fun. Educators endorsed the healthy mixture of outdoor exercise and technology, and these are indeed all good things.

$1 Billion business model

So how did Niantic generate $1 billion of revenue in 2016 alone?

When you downloaded the Pokemon Go app from the App Store, you agreed to a long list of terms and conditions. The amount of personal data you gave away may surprise you, and it continues to be gathered by Pokemon Go to this day. (You deleted the app, right?)

With new products capitalising on the Harry Potter franchise and ambitious plans in augmented and virtual reality, Niantic Labs was last year valued at a cool $4 billion.


Gamification using subtle persuasion is a powerful way to influence your market. First of all, in Pokemon Go there is an incentive to “catch ’em all”, familiar to anyone who watched the cartoon series. Secondly, building XP (experience points) and hatching Pokemon eggs encourage walking around and interacting with other players. Constant notifications alerting you to nearby Pokemon are all packaged in an attractive app. Finally, it becomes quite competitive to compare Pokemon with friends and hold battles.

Niantic implemented a persuasive app that is engaging to use. This didn’t just start in 2016; it was part of a longer-term strategy going back to 2001.

Background to Pokemon Go

From Niantic’s own website:

2001 – The origins of “Pokemon Go” go back to 2001, when a small team of scientists, gamers, cartographers and AI researchers developed an interactive 3D map of the world. This project became a company called Keyhole.

2004 – Keyhole was acquired by Google and renamed Google Earth.

2005–2009 – Google continued to develop 3D modelling of hundreds of cities, countries and even other planets, which over time produced the products we know today as Google Maps, Street View, SketchUp and Panoramio (which Google shut down in 2016).

2010 – Google founded the internal startup Niantic Labs. The aim was to leverage mobile devices and maps to achieve three stated purposes:

  1. Exploration and discovery of new places
  2. Exercise
  3. Real-world social interaction with other people.

However, the terms and conditions that few people read served several unstated purposes: access to personal data, access to contacts, access to location and more. These permissions create a gold mine of personal data generated by each player, which Niantic Labs is then free to sell on to third parties as behavioural intelligence.

2012 – Niantic Labs released Field Trip, a location-based mobile app that helps you find cool, hidden and unique things in the world around you.

Later the same year they released Ingress, an augmented-reality game of mystery, intrigue and competition.

2015 – Niantic was spun out of Alphabet Inc. as an independent private company, with $35 million in Series A funding from The Pokémon Company Group, Google and Nintendo. That year, they also announced the development of Pokémon GO in collaboration with The Pokémon Company and Nintendo.

2016 – Pokemon Go launched. In just two months, Trainers around the world collectively walked over 4.6 billion kilometres; three months later it was 8.7 billion kilometres.

Before 2016 came to a close, Pokémon GO had generated more than $1 billion in revenue, crossing that threshold faster than any app in history.

Harvesting of data

The hidden intention deep in the terms and conditions is to harvest information from players and provide it to local and corporate businesses, encouraging real-world purchases through promotions. How about a Pokemon drink at Starbucks after a long Pokemon battle? Or a Happy Meal at McDonald’s?

What Niantic had successfully engineered was access to a vast pool of data about people’s movements and metadata. The GPS data showed how players travelled to locations, how long they stayed, and how they interacted with other players. Combined with promotions, this enabled Niantic to sell the data and turn over $1 billion in its first year.
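To see how revealing raw GPS pings can be, here is a minimal sketch of the idea, with toy data: the venue coordinates, sampling interval and 50-metre radius are my own illustrative assumptions, not Niantic’s actual pipeline. From nothing more than timestamped positions, it estimates how long a player lingered near a given spot:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def dwell_seconds(pings, venue, radius_m=50):
    """Sum the time spent within radius_m of the venue.

    pings: list of (unix_ts, lat, lon), sorted by timestamp.
    An interval between consecutive pings counts only if both
    endpoints fall inside the radius.
    """
    total = 0
    for (t1, la1, lo1), (t2, la2, lo2) in zip(pings, pings[1:]):
        if (haversine_m(la1, lo1, *venue) <= radius_m
                and haversine_m(la2, lo2, *venue) <= radius_m):
            total += t2 - t1
    return total

# Toy trace: a player lingers near a (hypothetical) sponsored cafe.
cafe = (51.5074, -0.1278)
trace = [
    (0,   51.5090, -0.1300),   # walking towards the cafe
    (60,  51.5074, -0.1278),   # arrives
    (360, 51.5074, -0.1279),   # still there five minutes later
    (420, 51.5060, -0.1250),   # leaves
]
print(dwell_seconds(trace, cafe))  # seconds spent at the cafe
```

Multiply this by millions of players and thousands of sponsored venues, and “how long did they stay, and where did they go next” becomes a sellable dataset.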

The market for such behavioural data is huge. If a company can get quality information about an individual who will, with a high degree of probability, purchase its product, this is gold dust. Behavioural marketing is nothing new, but the digital medium allows so much more granularity.

Earning revenues not by “pay per click” but “pay per footfall”.

Real-world, real-time markets create sponsored locations for Pokemon to appear. This created a cash cow of footfall, where businesses could benefit from the Pokemon promotions.

Companies are keen to become Pokestops or Gyms and so generate footfall. This is a cost-per-visit model rather than Google’s cost-per-click. The lure of an augmented-reality world brings customers into real-world shops to make purchases; some Starbucks shops became Gyms for Pokemon battles.

The zeal for Pokemon Go has waned, but the data from the game is still available to Niantic. This gamification created massive behavioural analytics to shape people’s behaviour (hunt for Pokemon) and influence their purchasing habits, e.g. a Pokemon latte or a McDonald’s Happy Meal.

The app’s privacy policy gave it access to the camera, the microphone and contacts, with the ability to share location data. There were no limits on how long it could keep the data harvested from a user’s phone, and no restrictions on the third parties that would receive it.

On one level Niantic provides game services for players; on another, it provides prediction services for Niantic’s third parties.

The genius was to transform the game you see into a bigger game: the harvesting of behavioural data for financial gain.

Win-Win what’s the problem?

On one level everyone wins: gamers get the thrill of participating in a game with their friends, people get healthy exercise while exploring their neighbourhoods, and local businesses get more trade from players and can cross-promote their services to them.

On another level, unknown, unaccountable third parties have access to large amounts of personal data. If you want to know exactly what data Niantic holds and you live in the EU, you can make a Data Subject Access Request under GDPR, as one Reddit user did. He came up with a nice summary of the main data that Niantic holds.

What is troubling about this approach is the precedent it has set for future apps and business models:

1. Lack of transparency

Firstly, there is a lack of transparency about who the third parties are, what they do with the data and who is holding them to account. Because of the multiple layers of marketing, it is hard for a user to know whom to hold responsible. It is also clear how easy it is to identify real people from just a few pieces of information: it is possible to pinpoint almost anyone in an anonymised dataset from attributes such as gender, age and marital status.
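A tiny sketch makes the point; the five records below are entirely made up. Count how many people share each combination of “harmless” attributes, and any combination held by exactly one person is a unique fingerprint, even with all names removed:

```python
from collections import Counter

# A toy "anonymised" dataset: no names, just harmless-looking attributes.
records = [
    ("F", 34, "married"),
    ("M", 34, "single"),
    ("F", 34, "married"),
    ("M", 51, "married"),
    ("F", 28, "single"),
]

# How many records share each (gender, age, marital status) combination?
combo_counts = Counter(records)

# Anyone whose combination is unique is trivially re-identifiable.
unique = [r for r in records if combo_counts[r] == 1]
print(f"{len(unique)} of {len(records)} records are uniquely identifiable")
```

In a real dataset the attribute list is far longer (location history, device model, social graph), so the share of unique fingerprints climbs towards 100% very quickly.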

Cookie tracking has long been used to follow people around the internet. A third-party advertiser can stitch all this information together to form a very detailed profile of you, all without your consent or permission.

2. Tyranny of the default

Most users regard the long-winded privacy policy, with its hard-to-read terms and conditions, as “clickwrap”: they simply accept all the terms and get on with the game. Only the most die-hard legal types are going to bother to read them. The tyranny of the default means there is no choice but to accept the conditions, which traps the user into disclosing and relinquishing their personal data.

3. You are the product

The ability to take multiple data points from any individual enables third parties to market to them with a high degree of accuracy. Knowing a person’s age, location and friends, and the way they travel around (on foot, by public transport, by car and so on), a lot can be understood about them.

For many users, being the product is not a problem if they are getting good-quality services such as email, insurance or healthcare. But what if a healthcare company charges you a very high premium because it knows you most likely have a terminal disease you don’t yet know you have?

It goes beyond users simply being the product: users are not even aware of the data they have given away. Certainty about who will make a purchase is now driving the marketing of products, which is fine for non-essential purchases, but when it affects who will get healthcare or insurance it becomes life-threatening.

4. Manipulation of personal data

Personal data is manipulated without explicit consent: big tech companies exploit deep knowledge of an individual’s tastes and preferences to persuade them to purchase products and services.

At least in the EU there is some protection afforded by GDPR, in the sense that explicit consent must be obtained. In most of the world, however, such laws are not present and users have no such protection.

Large companies such as Facebook are flouting best ethical practice. For example, in 2014 Facebook conducted an experiment on the effect on people’s moods of manipulating posts to make them more positive or negative: News Feeds were being manipulated without users’ knowledge or consent. For many people, Facebook is the source of much of their information. Facebook justified this approach on the grounds that every user has accepted terms and conditions stating that Facebook “may use the information we receive about you…for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” But this kind of manipulation is completely unethical and has disturbing consequences. What if someone went on to take their own life because of a manipulated news feed?

Let us not forget the Cambridge Analytica scandal, in which people’s personal information was again harvested from Facebook for political purposes without consent. Despite all the furore and complaint, little appears to have changed at Facebook.

5. Commerce is moving from human discretion to AI certainty

A lack of human touch due to the automation of “Artificial Intelligence” decisions is dangerous: the robo mortgage broker that decides you are not eligible for a mortgage, or the loan-repayment company that has no sympathy for the loss of your job.

Where are the checks and balances to intervene when the wrong decisions are made?

There is the story of an elderly couple who could no longer afford the payments on their 1999 car due to spiralling medical costs. The repossession man was sent for the car and repossessed it, as he was required to, but he wanted to help. He started a GoFundMe campaign that raised $25,000 to pay off the debt and help the couple out. This kind of story would not be possible if a purely logical AI were making the decisions.

In a world that is increasingly automated, the repossession decision is an automatic one, leaving no room for the love, kindness and generosity that should be there. We are moving towards an ever more connected but impersonal world.


Innovative games like Pokemon Go push the boundaries of augmented reality, as well as the legality of harvesting personal data. Technology is always going to be way ahead of existing regulation. It is important that along the way we don’t trade away commerce’s best principles of trust and fair play for profit. We still need to appreciate the great things about human relationships: love, intimacy, compassion, mercy, grace, generosity, privacy and protection, to name but a few.

The world is becoming ever more connected: a fire in Australia, a plane crash in Iran or the repossession of an elderly couple’s car in the US can be communicated within seconds. A future predicted by AI, in which relationships, opinions and purchases become controlled, is a dangerous one.

There need to be the right checks and balances in place, and hidden practices must be challenged. As digital customers, we need to hold Google, Amazon and Facebook to account.
