How to Innovate with Data – API First Development

Does your company know what to do with all the data you collect? 

Are you too slow to innovate?

Do you spend your days trying to make a case for enhancing operations or building completely new business opportunities using data? Have you heard the word API, but are you afraid it will take too long to discover which APIs to build?

API first = business and users first

Your competitors, customers, and partners are catching up with you and striding past as you read this article. Your bosses want you to build a killer app, but guess what? Users are not going to line up for it just because you publish it. Users want stuff to work and data to flow; they don’t need “yet another UI”. But you need to develop something that adds value, and you need to do it fast. Mission impossible? 

The API first mindset helps companies:

  1. Focus the data innovation efforts in areas which deliver actual business value
  2. Enable rapid development cycles
  3. Reduce cost
  4. Increase value

Venice Floods + Satellite Images = Insurance Scam?

Imagine you are a manager at an insurance company handling the claim aftermath of a sudden massive flood in Venice, Italy. Which of the thousands of claims concern property really damaged by the flood and should be compensated? 

Using a quick process from the APIOps Cycles method (training and consulting by Osaango) and tools from WhereOS, you could try out a solution in a matter of days:

1) Identify the business problem by starting from the customer & employee needs 

Look at the customer and employee journey to create a jobs-to-be-done list of tasks. An example in this case: “Verify the extent of the flood around a building with a flood claim”. 

2) Get your software developers or data scientists involved and figure out what ready-made data and functionality would bring you gains:

 “If we had satellite images from the flooded areas during the flood, an API to read metadata such as geolocation, and the ability to match that with the street address of a building, this would be very easy.”

Analyze any pains you might have in using those data and APIs. Get your teams, partners, vendors, etc. to suggest APIs with features that solve those problems.

3) Identify all the internal and external APIs that provide the required gain-creating and pain-relieving features

For example, a Flood Level Point API for detecting the current flood level. The level can be based on ESA satellite data, complemented by an API for flood history analysis.
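
The article does not show the API itself, but as an illustrative sketch (all names, coordinates and water levels below are hypothetical), a flood-level point lookup could reduce to geocoding the claim’s street address and returning the value of the nearest cell in a satellite-derived flood grid:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Hypothetical flood grid derived from satellite data: (lat, lon) -> water level in metres.
FLOOD_GRID = {
    (45.4340, 12.3390): 1.10,   # near Piazza San Marco
    (45.4400, 12.3160): 0.20,
    (45.4450, 12.3550): 0.00,
}

def flood_level_at(lat, lon):
    """Return the water level of the flood-grid cell nearest to a geocoded address."""
    cell = min(FLOOD_GRID, key=lambda c: haversine_m(lat, lon, c[0], c[1]))
    return FLOOD_GRID[cell]

# A claim for a building geocoded close to Piazza San Marco:
level = flood_level_at(45.4339, 12.3400)
```

A real service would of course read the flood grid from the satellite-data API rather than a hard-coded dictionary; the point is that the claim handler’s question becomes a single function call.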

4) Create a cost-benefit analysis, aka the business case or model

You can use the API Business Model Canvas to do that very easily. Just make sure you have the right business and tech people in the room to have that discussion.

5) Build the MVP with a few clicks and verify it with a real case

Let’s take some data from ESA (European Space Agency) Sentinel 2 satellites. Start using the WhereOS application and expose APIs from the data set.

Test the system in a real-life example. You can use actual claims and manually inspected flood levels in different areas to make sure the system delivers the expected results. Testing is light and efficient, allowing rapid reaction to pain points and continuous development.

Iterate: Go back to the business problem and measure against your KPIs that your solution is actually delivering the value you wanted. Rinse and repeat, and you will find even better sources of cost savings or new profit. You may find that you need to amend the data with other types of information (e.g. local weather data in the case of floods) to help the system deliver better results.

An example of a business model an insurance company could come up with, instead of just reducing immediate costs: insurance companies can classify claims based on the location of the property and the detected flood levels. Using the classification, companies can define which claims can be automated and which need a manual process. Insurance companies could also offer historical flood risk analysis as a new service offering; the information would be valuable for property investors and insurance product designers.

Pic 1. Satellite radar data processed into a picture with WhereOS of Venice floods in November. Green and yellow areas represent areas that have been flooded. Data source: ESA Sentinel 2 Satellites.

Our message to you: Innovation needs technology, but shouldn’t be driven nor prevented by it

Too often, building APIs and using data turns into technology-driven proof-of-concept projects run by technical teams, with discussion revolving around technical solutions; the connection between technology and business is missing. The focus should be on proving and maximizing the business value and creating outstanding customer experiences with minimal investment in technical solutions. A business case has to be solid before starting to build an actual platform.

The partnership between Osaango and WhereOS strives to turn the process around. We focus 100% on proof-of-value projects, where we iteratively develop solutions with an API first mindset. We help customers focus on business goals and see the underlying technical solutions and various systems through APIs. The API first mindset also enables rapid development of all the layers in the solution: teams can develop their parts independently by using the jointly agreed APIs.

Contact us

Marjukka Niinioja, Osaango Ltd – founder of the APIOps Cycles method and co-author of the API Economy 101 book

More information:

Jussi-Pekka Partanen, Chief Data Scientist – API development and WhereOS

More information:

WhereOS and EEE Innovations Speed Up Data Science Solution Development for Automotive Data

Speeding Up Development and Deployment of Solutions for the Customers of EEE Innovations

EEE Innovations products and services were featured in the Advancements TV series with Ted Danson. View the full video.

EEE Innovations Oy develops the E3 ForeC and E3 Sense products for passenger cars and heavy traffic, collecting and refining data from the CAN bus of vehicles, such as accurate location, outdoor temperature, unexpected braking situations, vehicle diagnostics and CO2 emissions. The E3 solution produces information from vehicles for developing data-driven products and services for drivers, fleet managers, authorities, road maintenance operators, insurance companies, vehicle computer manufacturers, self-driving cars and other road users.

EEE Innovations needs to create specific analyses, visualizations, dashboards and applications from the data they collect from their customers’ vehicle fleets. The goal was to speed up innovation work: how can the cycle of identifying customer needs and rapidly creating analyses, visualizations and applications be shortened, so that the results can be used for marketing and as proof-of-value concepts for customers, bringing EEE Innovations more brand exposure and actual sales leads?

Cross Functional Kick Off Day

The process started with WhereOS and EEE Innovations hosting a joint workshop day with a cross-functional EEE team of people from different backgrounds, including data science, R&D, marketing, communications and management. The goal was to kick-start the development by identifying value-adding use cases from the customer and marketing point of view, and, with the help of the EEE Innovations and WhereOS technical teams, rapidly develop the first applications to match the identified use cases.

The result of the kick-off day was excellent from both the EEE Innovations and WhereOS points of view: “It was amazing to see how fast our team picked up the idea of rapidly creating new kinds of data visualizations that actually help us sell our products and data better to our customers”, says Jarmo Leino, CEO of EEE Innovations. “I was impressed by the team at EEE Innovations, and how they could innovate new ideas and develop and deploy new applications within just one day, creating something new and interesting for customers”, says JP Partanen, CEO of WhereOS.

Geospatial Time-Series Data Development by EEE Innovations

After the kick-off day, EEE Innovations continued the development under the lead of CTO Paula Silvonen: “Using WhereOS for creating value-adding analyses and visualizations of geospatial time-series data from our EGRIP data was easy and definitely produced a productivity boost for our team”. The EEE Innovations team continues to use WhereOS as part of its data science tool chain and plans to extend the work by also participating in the WhereOS Ecosystem.

EEE Innovations Featured in the Advancements TV series and Expanding to Passenger Cars

The world-leading work of EEE Innovations was recognized when its ForeC and EGRIP systems were featured in an episode of the Advancements TV series with Ted Danson on CNBC. Some of the time-series visualizations implemented with WhereOS are shown in the episode.

“We continue to create new innovations together with WhereOS in this space – there’s more to come, especially in the area of passenger cars, so stay tuned”, says Jarmo Leino, CEO of EEE Innovations.

Jarmo Leino, CEO, EEE Innovations
More info: Twitter LinkedIn

JP Partanen, CEO, WhereOS
More info: Twitter LinkedIn

Kiinteistömaailma, eCraft and WhereOS analyzed the house price development in Finland

Kiinteistömaailma, eCraft and WhereOS implemented together an analysis of house price development in the six largest cities in Finland. The raw data produced by Kiinteistömaailma was analyzed with the eCraft Property Valuator tool and visualized into a heatmap video using WhereOS. 

House price development in the capital area of Finland (see videos for the other five largest cities below). Green = +5% yearly price increase, red = -5% yearly price decrease. 

The price development videos clearly show how prices dropped during 2008 and 2009 due to the worldwide financial crisis, but after that the development has been steady and positive. Regional development in the capital area, but also in smaller cities, has varied a lot: city centers usually show as green (prices have increased), whereas red areas appear mostly further away from the city centers. “The heatmap animation produced by eCraft and WhereOS shows in an interesting way how real estate prices have developed during the last decade in the largest cities of Finland. It shows clearly how price levels change in different cycles not only between large cities but also in different areas within the cities”, says Taina Mustamo from Kiinteistömaailma.

Hyperlocal Price Indexes with eCraft Property Valuator

eCraft Property Valuator is a software/data/artificial-intelligence-as-a-service offering, which provides the best estimates and predictions for real estate price development in Finland. “eCraft’s Property Valuator is a unique model and an excellent tool for calculating hyperlocal price indexes”, says Jussi-Pekka Partanen from WhereOS.

A more detailed description (in Finnish) can be found in the blog article “Kuinka kehitimme Suomen parhaat asuntoennusteet osa 1”.

Visualizing Heatmap Videos with WhereOS

The data from eCraft Property Valuator was visualized using the WhereOS heatmap rendering function. The heatmap was calculated by dividing Finland into a fine-grained grid and calculating the average house price for each grid cell for each year and month.

The monthly grid data was converted into visual form using the heatmaprender function and combined into the final video file using the rendervideo function. In total, processing the price data from eCraft Property Valuator into grid data, visualized heatmaps and the final video required roughly 10 lines of code in WhereOS. “WhereOS is an excellent tool for analyzing and visualizing geospatial time-series data”, says Tommi Vilkamo from eCraft. “It has never been possible to see real estate market data visualized this accurately and vividly before.”
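
The grid-averaging step can be sketched in plain Python along these lines (the grid resolution, field layout and sample prices are illustrative; the actual WhereOS implementation uses SQL/R functions not shown in this article):

```python
from collections import defaultdict

def price_grid(sales, cell_deg=0.01):
    """Average house price per (grid cell, year, month).

    `sales` is an iterable of (lat, lon, year, month, price_per_m2) records;
    `cell_deg` is the grid resolution in degrees (roughly 1 km north-south
    at Finnish latitudes).
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, year, month, price in sales:
        # Bin each sale into a grid cell keyed by floored lat/lon indices.
        cell = (round(lat // cell_deg), round(lon // cell_deg), year, month)
        sums[cell] += price
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

grid = price_grid([
    (60.1692, 24.9384, 2018, 5, 5200.0),  # Helsinki centre
    (60.1698, 24.9358, 2018, 5, 4800.0),  # same grid cell
    (61.4978, 23.7610, 2018, 5, 2400.0),  # Tampere
])
```

Each (cell, year, month) average then becomes one pixel value in one frame of the heatmap video.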

House Price Development in Six Largest Cities in Finland

You can see the videos for the six largest cities using the links below:

House prices in capital region: Helsinki, Espoo, Vantaa
House prices in Tampere
House prices in Turku
House prices in Kuopio
House prices in Oulu
House prices in Jyväskylä

More information

For hyperlocal price index development contact Tommi Vilkamo, Chief Data Scientist at eCraft,

For geospatial timeseries data analysis and visualization of heatmaps and videos contact Jussi-Pekka Partanen, Chief Data Scientist at WhereOS,


Predicting Commodity Price Movements with Kontoor AI and WhereOS

Each day, thousands of news articles are published by large news agencies. Reuters alone publishes over two million articles each year. On any given day, the world looks very confusing based on the news flow: some companies are optimistic and increasing their investment, while some firms are going bankrupt. Even for a professional analyst who has the time and resources to devote to making sense of the economy, the job is overwhelming. Kontoor is a cloud-based neural network that automatically reads each published news article in real time and creates an unbiased analysis of the news. The system stores all economic and corporate data from the news and makes sure all relevant information is captured.

A news dashboard view and a predictive model developed with Kontoor and WhereOS

The challenge of automated data collection has traditionally been that the English language is extremely rich in vocabulary and expressions. Each event or phenomenon can be expressed in dozens of different ways, which makes traditional search engines ineffective. “Many platforms are capable of detecting synonyms for nouns and adjectives. Kontoor is the only platform capable of understanding the semantics of verbs as well. Phrasal verbs are essential in the English language. For instance, think of the dozens of expressions there are for something increasing”, says Jukka Taskinen, CEO of Kontoor.

“With Kontoor and WhereOS technologies, we’re able to engage the power of artificial intelligence, or more precisely advanced semantic neural networks, to extract important business information from the news, and train machine learning algorithms e.g. to predict commodity price movements for different industry sectors”, says JP Partanen, CEO of WhereOS.

The collaboration between WhereOS and Kontoor leverages the strengths of each company: Kontoor’s ability to turn unstructured news into quantitative information, and WhereOS’ ability to integrate and operationalize multiple data sources and develop machine learning models, APIs and applications on top of the underlying infrastructure. “We have been extremely satisfied with WhereOS, as it has helped us speed up our development of APIs and UIs based on our data by a factor of five”, says Jukka Taskinen, CEO of Kontoor. “It has been great to work with the truly talented team at Kontoor and see how powerful the Kontoor semantic artificial intelligence technology really is at understanding and documenting news feeds”, says JP Partanen, CEO of WhereOS.

Extracting Information from Unstructured News Text with Kontoor

When we read the news, we are looking for “events” – something that is taking place. This allows us to form an idea of the events that are shaping the direction of the world. Kontoor mimics this by creating generic events for companies and markets. Generic events cover common company activities such as increasing investment, hiring people, nominating a CEO, adjusting a dividend or facing legal penalties. Events such as floods, strikes and weather that have an impact on daily business can also be identified. Events can be identified for companies, products and countries. Further, Kontoor identifies time periods, people and numeric data that help in the analysis.

On any news day, there is an endless stream of small stories about companies you have never heard of, and these stories are easily dismissed as insignificant. In other words, you are losing almost all of the information in the news. Every story carries small pieces of data about how a business is performing and how it sees the future. For instance, a story about Beiersdorf says the company increased sales and market share in its 3Q earnings release:

“Beiersdorf, the maker of Nivea skin creams, said it snatched market share away from rivals as it posted a 6 percent rise in organic group sales in the first nine months.”

It is very difficult and labor-intensive to read all the news and store it in a meaningful manner for later use. Kontoor, on the other hand, is able to automatically collect and process such data about all events related, for example, to selected corporations, and create a fully documented history for each firm. You may not be interested in this particular company, but by aggregating event data for the industry, it is possible to start understanding the big picture of the industry or product segment – trends become visible. By aggregating data for all industries, you will start seeing trends in the economy before the statistical organizations release their data with a lag. Industry data can be divided into manufacturing and services to predict PMI, industrial production and other similar indicators.
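
The aggregation idea can be illustrated with a short sketch (the event records and the index formula are invented for illustration, loosely in the spirit of a PMI-style diffusion index):

```python
from collections import defaultdict

# Illustrative firm-level events extracted from news: (month, industry, direction),
# where direction is +1 for improving events (sales up, hiring, investment)
# and -1 for deteriorating ones (profit warnings, layoffs).
EVENTS = [
    ("2019-10", "consumer_goods", +1),   # e.g. Beiersdorf sales rise
    ("2019-10", "consumer_goods", +1),
    ("2019-10", "consumer_goods", -1),
    ("2019-10", "steel", -1),            # e.g. Thyssenkrupp margin cut
    ("2019-10", "steel", -1),
]

def diffusion_index(events):
    """Share of positive events per (month, industry), scaled to 0..100
    (so 50 would mean an equal split of positive and negative events)."""
    pos = defaultdict(int)
    total = defaultdict(int)
    for month, industry, direction in events:
        key = (month, industry)
        total[key] += 1
        if direction > 0:
            pos[key] += 1
    return {key: 100.0 * pos[key] / total[key] for key in total}

index = diffusion_index(EVENTS)
```

With thousands of firms per industry, such an index starts moving before the official statistics are released.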

Kontoor Understands the Difference Between the Past and the Future

There is a significant difference in economic value between past events and expected future events; Kontoor is specialized in understanding the difference between the past and the future when processing the news. This enables Kontoor to automatically gather expectations related to, for instance, commodity production changes from thousands of articles. Similarly, expectations for companies and markets can be gathered from the news, making it possible to form a synthesis of opinions from industry experts and companies.

Let’s look at some practical examples. The airline SAS reported third-quarter profits that topped expectations – a past event – and raised its full-year outlook. The steelmaker Thyssenkrupp expects its profitability to suffer – an expected future event. By gathering similar data for the industry, we can see whether these events are company specific or an indicator of an industry trend.

“Scandinavian airline SAS hiked its full-year earnings outlook on Friday as third-quarter profits topped market expectations.“

“Thyssenkrupp (TKAG.DE) cut the profit margin forecast for its capital goods business on Tuesday.”

Kontoor similarly detects expectations for sales, orders, production, investment and so forth – for any imaginable economic or corporate event.

Kontoor Semantic Artificial Intelligence Technology

Kontoor’s semantic deep neural network is trained with thousands and thousands of examples collected from the news: examples of each expression for corporate and economic events. Based on the training material, the network is built up as a multidimensional matrix of words that forms clusters for each event we want to detect. Learning happens as new expressions arrive that are close enough to an existing cluster. The network predicts the meaning of a new expression from its proximity to existing expressions.
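
The nearest-cluster idea can be sketched as follows, assuming toy 3-dimensional embeddings in place of Kontoor’s actual multidimensional word matrix (cluster names and vectors are illustrative):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Toy cluster centroids for two event types in a 3-d embedding space;
# real clusters would live in a much higher-dimensional space.
CLUSTERS = {
    "increase": (0.9, 0.1, 0.0),   # "rise", "hike", "ramp up"
    "decrease": (0.0, 0.2, 0.9),   # "cut", "slash", "scale back"
}

def classify(vector, threshold=0.7):
    """Assign an expression vector to the nearest cluster if it is close
    enough; otherwise flag it as a candidate for a new cluster."""
    best = max(CLUSTERS, key=lambda name: cosine(vector, CLUSTERS[name]))
    return best if cosine(vector, CLUSTERS[best]) >= threshold else "unknown"

# An unseen expression embedded near the "increase" cluster:
label = classify((0.8, 0.2, 0.1))
```

Expressions that fall below the threshold are exactly the cases where the network can grow a new cluster.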

The Kontoor platform consists of the core Kontoor technology and the infrastructure around it, which resembles a search engine with artificial intelligence on top. The artificial intelligence consists of an IBM Watson-style first-generation component and a semantic neural network, while the search-engine-like infrastructure stores the indexed news stories.

The core technology is based on the Stanford NLP library for natural language processing and different proprietary tools, models and databases, including a machine learning model for sentiment analysis and deep-neural-network-based models for event extraction. The training material for the AI models is constructed from huge unstructured news datasets, thousands of handcrafted rules and data engineering.

The infrastructure built around the Kontoor technology consists of different Hadoop ecosystem technologies. Data storage is organized with HDFS (Hadoop Distributed File System) and HBase. Data indexing and search features are implemented with a multi-server Solr Cloud installation and Cloudera HBase-Solr indexers. News tracking and fetching is done by Kontoor’s own tracking module, with stream processing handled by Spark Streaming and the Kafka messaging system. Together, these technologies allow Kontoor to fetch new data, process it, store it and make the results available with minimal delay. The Kontoor platform is accessed via a Play framework and ReactJS based user interface and an Akka based API.

Creating Predictive Model and Application UI with WhereOS

WhereOS connects to the Kontoor backend to receive a stream of quantitative data related to the news (events, locations, time, sentiment, etc.). WhereOS can then be used to create a layer of functionality on top of the data by implementing different data processing pipelines in the WhereOS backend:

  1. Data APIs – the processed news data can be made available through APIs, and new APIs can be created dynamically within minutes.
  2. Combining the news data with other data sources – the quantitative data extracted from the news can be combined with other data sources, such as commodity price index information.
  3. Machine learning models – data from the news can be used to train machine learning models (XGBoost in this case) against, for example, commodity price index information (or any other quantitative data), producing a model that predicts e.g. the price index value from the processed news feed received from the Kontoor API.
  4. Application UI – the data can easily be visualized in different types of user interface applications for various purposes.
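
A hedged sketch of step 3: the article names XGBoost, but for a self-contained illustration an ordinary least-squares fit with numpy stands in for the gradient-boosted model; the feature columns and index values below are invented:

```python
import numpy as np

# Illustrative monthly features extracted from news (e.g. counts of
# supply-disruption and demand-growth events) and a commodity price index.
X = np.array([
    [2.0, 5.0],
    [3.0, 4.0],
    [5.0, 1.0],
    [6.0, 1.0],
], dtype=float)
y = np.array([100.0, 103.0, 110.0, 112.0])

# Fit index ~ X @ w + b by least squares (a linear stand-in for the
# XGBoost model mentioned in the article).
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(features):
    """Predict the price index for a new month of news-derived features."""
    return float(np.dot(features, coef[:-1]) + coef[-1])

pred = predict([4.0, 3.0])
```

In production, the same pipeline shape holds: news features in from the Kontoor API, a trained model in the middle, and a prediction API out of WhereOS.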

WhereOS uses Spark and Hive as the execution engine for the pipelines, which means data processing always runs as distributed computation and can scale to billions of data points. WhereOS pipelines can be created using the SQL and R programming languages, with built-in functions for ETL (extract, transform, load) operations, statistical analysis, machine learning and artificial intelligence (AI), geographical and geospatial analysis, data visualization etc., which makes it simple to implement different data processing actions.

Jukka Taskinen, CEO of Kontoor, +358 40 548 3780
More info:

JP Partanen, CEO & Chief Data Scientist of WhereOS, +358 50 486 9257 Twitter LinkedIn
More info:

RoadCloud and WhereOS Collaborate on Automotive Data Collection and Processing

Traffic Jams in Helsinki

Are there traffic jams in Helsinki? People living in Helsinki would say YES, while people living in central Europe would probably laugh at the question. The traffic jams are relatively mild compared to many other places, but there are clearly times and locations where traffic slows down significantly.

RoadCloud and WhereOS have joined forces to collect, analyze and visualize data collected from the fleet of commercial vehicles through RoadCloud sensors. “It’s amazing how detailed information you can collect from road network conditions through RoadCloud sensors”, says JP Partanen, CEO of WhereOS.

In this series of articles, we explain how data can be collected from vehicle fleets and processed into models that can further be used to solve different business problems, such as analyzing traffic conditions and road conditions, or even creating machine learning / artificial intelligence models to predict these conditions from external variables such as weather forecasts. In this article, we explain how we created a video visualization of traffic conditions and jams throughout the day. “We are impressed with WhereOS and how quick it was to turn our data into an insightful video. We are integrating our data API with WhereOS and making our anonymized vehicle data easily accessible”, says Ari Tuononen, CEO of RoadCloud.

Capturing Vehicle Flow Data with RoadCloud IoT Sensors

RoadCloud has equipped a commercial vehicle fleet with RoadCloud IoT sensors to collect and monitor vehicle data and road conditions. The sensors automatically collect basic information such as speed, heading and acceleration, but more importantly information about road surface conditions, such as road friction, road state (dry/snow/ice/water), temperature, bumps and potholes. Data collection takes place 24/7, as the commercial vehicles are on the road continuously throughout the day and night.

The sensor data is uploaded to the RoadCloud data backbone, where it is stored for further use and analysis. The data can be processed as historical data or as a continuously updating real-time feed.

Analyzing and Visualizing the Sensor Data with WhereOS

WhereOS connects to the RoadCloud data backbone and can process the data further. In this visualization, the RoadCloud data is processed and combined with OpenStreetMap (OSM) street network data in several pipelines:

  1. The OSM data preparation pipeline extracts the OpenStreetMap street network for the desired region, the Helsinki capital area in this case, and splits the streets into segments of a desired length (e.g. 200 m).
  2. The RoadCloud data extraction pipeline loads data from the RoadCloud data backbone and joins the GPS (latitude & longitude) data points to the corresponding street segments. The join uses a WhereOS geokey/geohash based operation for matching massive amounts of geographical shapes – GPS points and polylines/linestrings (street segments) – together.
  3. The NTILE pipeline takes the joined data and approximates the maximum speed for each street segment by taking the average of the 10% highest speeds driven on that specific segment.
  4. The hourly traffic pipeline calculates the average speed driven on each individual street segment for each hour, and the speed decrease from the maximum speed.
  5. The rendering pipeline takes the speed decrease for each street segment for each hour and produces an MP4 video where each frame represents the traffic conditions (speed decrease) for each street segment during the given hour. Each street segment is colored so that red corresponds to a high speed decrease (i.e. heavy traffic / a traffic jam) and green means no decrease (i.e. traffic speed close to the maximum).
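
Steps 3 and 4 can be sketched in plain Python (segment IDs and speeds below are illustrative; the real pipelines run as distributed Spark/Hive jobs):

```python
def segment_max_speed(speeds):
    """Approximate a segment's free-flow speed as the mean of the top 10%
    of observed speeds (at least one observation is always used)."""
    top = sorted(speeds, reverse=True)
    k = max(1, len(top) // 10)
    return sum(top[:k]) / k

def hourly_speed_decrease(observations):
    """observations: list of (segment_id, hour, speed_kmh).
    Returns {(segment_id, hour): decrease from the segment's max speed}."""
    by_segment = {}
    for seg, _, speed in observations:
        by_segment.setdefault(seg, []).append(speed)
    max_speed = {seg: segment_max_speed(v) for seg, v in by_segment.items()}

    by_hour = {}
    for seg, hour, speed in observations:
        by_hour.setdefault((seg, hour), []).append(speed)
    return {
        key: max_speed[key[0]] - sum(v) / len(v)
        for key, v in by_hour.items()
    }

# Ten free-flow observations at night, congested ones in the morning rush:
obs = [("seg-1", 3, 50.0)] * 10 + [("seg-1", 8, 20.0), ("seg-1", 8, 30.0)]
dec = hourly_speed_decrease(obs)
```

The rendering pipeline then only needs to map each per-hour decrease to a color (red for large values, green for zero) on the segment geometry.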

WhereOS uses Spark and Hive as the execution engine for the pipelines. WhereOS pipelines can be created using SQL and R programming languages including built-in functions for ETL (extract, transform, load) operations, statistical analysis, machine learning and artificial intelligence (AI), geographical and geospatial analysis, data visualization etc.

Traffic Jam Video & Further Innovation

The data used for this video was collected from the vehicles over one full year (2018). You can see how the rush hour traffic affects the average speeds at different locations at different times of the day.

The RoadCloud data can be used for many other analyses, such as how road conditions – friction, road state (dry/wet/ice/snow), temperature – affect the average speeds, or how a speed bump or some other new traffic arrangement affects the traffic around it. In upcoming articles, we will dig deeper into the data and create new, interesting visualizations from it.

Ari Tuononen, CEO, RoadCloud, +358 50 5604 702
More info: Twitter LinkedIn

JP Partanen, CEO, WhereOS, +358 50 486 9257
More info: Twitter LinkedIn

Most Extensive Visualization of US Income: Over 33,000 Zip Codes Visualized

This application visualizes the mean household income in the United States for each zip code (ZCTA, zip code tabulation area): Click here to open the application

Background: Urban Decay

The visualization clearly shows how large cities in the US are surrounded by wealthy zip code areas (red rings). Inner cities, however, are areas with lower income (green or yellow), with a few exceptions such as New York, Washington DC and San Francisco.

“In the United States during the 1940s, for the first time a powerful interaction between segregation laws and race differences in terms of socioeconomic status enabled white families to abandon inner cities in favor of suburban living. The result was severe urban decay that, by the 1960s, resulted in crumbling “ghettos“. Prior to national data available in the 1950 US census, a migration pattern of disproportionate numbers of whites moving from cities to suburban communities was easily dismissed as merely anecdotal. Because American urban populations were still substantially growing, a relative decrease in one racial or ethnic component eluded scientific proof to the satisfaction of policy makers. In essence, data on urban population change had not been separated into what are now familiarly identified its ‘components.’ The first data set potentially capable of proving ‘white flight’ was the 1950 census.” Source: Wikipedia

The visualization is based on American Community Survey (ACS) from

Red = high income
Yellow = medium income
Green = low income


I’m looking for bloggers who want to co-author articles about interesting/surprising facts about US (or European) population demographics, economy, etc., along with visualizations. Also, if you have ideas for further visualizations, please contact me via

If you want to embed this app to your web page, read the instructions here.


Finland: 58 Interesting Indicators Visualized

This is an app through which you can visualize demographic data from Finland. Try also clicking, for example, “Relative” and “Higher level university degree”, or “Employed”.

Click here to open the application

The data used in this visualization: Paavo – Open data by postal code area, Statistics Finland. The material was downloaded from Statistics Finland’s interface service with the licence CC BY 4.0.

Check this article to learn how to embed a WhereOS app in your own website. If you have ideas about what kind of data you’d like to see in the apps, write me an email


Location Analytics Industry Trends

Here are three relevant industry trends that explain why I believe WhereOS is going to change the world:

  1. Hyperconnected World – Businesses are getting more and more connected to online, cloud-based information systems as part of their business processes. Customers are accessing companies’ online services 24/7 through different interfaces, generating more and more information about their purchase behavior, location, movement, social interactions and so forth – all tied to geographical locations.
  2. Business Digitalization – New businesses are challenging traditional industries by using combinations of static and dynamic information to create new services for their digital customers: they know better who their customers are, what they want, where they are located, and how to serve them best. They know where the operational bottlenecks are and can deliver the right goods and services to the right locations to create the best customer experience, maximize sales and minimize cost.
  3. Data Explosion – The amount of data businesses collect from their customers and business operations is constantly increasing. Data sets are large and in constant change – the existing IT systems and processes do not facilitate innovation amid the fast-paced change many industries are undergoing.

With WhereOS, it’s extremely easy to build APIs and applications based on data: instead of months, new applications can be created in a matter of hours or days. This facilitates lean prototyping of how data can be used to create new value-adding services and products, helping both traditional and new businesses speed up development.

If you are interested in discussing more, please contact me through