Innovating and expanding it is harder. How to (further) monetize telematics data?
Is it possible for an already lean Fleet Management Organization (FMO) to be even more efficient and to get some new perspective into their business? Is it possible for an established TSP business to make a breakthrough into new areas? If so, how?
A typical Telematics Services Provider (TSP) merely maintains an existing Fleet Management Organization's (FMO) network using proven, standard HW & SW technology. Growth is limited by technological barriers, inflexibility toward frequent customer requirements, and constantly declining profit margins.
On the other side, to make savvy business decisions, agile FMOs seek more insightful analytics of their data.
All of a sudden, there are lots of new players that didn't exist a few years ago, who also want to make business use of connected car data.
Even though incremental innovations are continuously being made, the looming questions nowadays are:
This text digs a little deeper into an emerging idea: flexible, fast, and valuable Big Moving Data analytics as a major prerequisite for growth and monetization. It also offers a few ideas on how to become a technology and analytics frontrunner in the TSP business.
Only businesses that offer something fundamentally different and distinguishable grow. In the telematics industry, offering answers to unforeseen, perceptive, and complex questions is the new differentiation point.
The vast amount of connected moving objects data, combined with advanced and focused big data analytics skills, is the new business differentiation niche.
Current reporting practices at TSPs are most commonly based on conventional vehicle and driving parameters and, to some extent, their derivatives. Reports are usually provided as-is, as off-the-shelf products, with limited possibility of customization. Querying data is offered through different sets of APIs, but the TSP's data is what limits the derived analytics.
Off-the-shelf reports/analytics simply do not suit every FMO.
Off-the-shelf driver scoring, driver congregation, and fleet utilization analytics, as well as the handling of business versus private activities, fit only generalized types of vehicle fleets. Making business decisions based on such analytics is often far from adequate.
Agile FMOs seek opportunities to improve their operations. To do so, they want to analyze the fleet's behavior in a scientific manner - find hidden anomalies, correlations, and causalities, which obviously cannot be accomplished with generic reporting.
The solution is to offer a set of flexible and powerful analytical tools that are easily customizable. This eases the never-ending customization requests and enables in-house analysis for advanced customers.
A smart TSP understands that fleet management is a highly operational, day-to-day job and aims to ease this process for FMOs. An FMO's goal is to reduce costs and increase fleet efficiency, and the smart TSP focuses on helping its FMO customers achieve these goals.
Typical FMO requirements are:
Apart from these standard expectations, FMOs want to make better use of their data. However, finding exactly how and where to improve efficiency, and where excess costs occur, is challenging.
TSPs should be very flexible in serving these FMO business needs, offer new products/services, and establish new revenue streams. As a tech company, the burning question for a TSP is: How can we leverage our technology to meet clients' advanced needs and broaden the company's revenue streams?
A TSP monitoring even a few thousand connected vehicles and storing their historical data unequivocally deals with Big Data. Unfortunately, it typically handles that data in a very basic manner, with standard reporting.
Fleet managers learn that, apart from standard reports, advanced analytics provide more actionable insights into a fleet's daily life. Companies are increasingly making advanced analytics and Big Data part of their daily monitoring and decision processes, and FMOs and TSPs are no exception.
With all of the above in mind, let us pose some questions about specific business areas and the processes within them.
Driver and other traffic participants' safety is probably the utmost concern for FMOs.
The benefits of an improved driving style appear in other business aspects as well, such as reduced maintenance and fuel costs. A balanced driving style reduces fuel costs and diminishes the risk of failures and accidents. These are the reasons why fleet managers pursue safety standards and steadily increase their watch over different vehicle and driving parameters.
Safety analytics is primarily based on monitoring drivers' behavior. TSP analytics typically provide information on hard cornering, harsh accelerating, and hard braking, and agile FMOs translate these insights into driver education programs to raise safety scores. However, accelerometer sensors rely on G-force to classify a particular event as aggressive, and the quality of their readings is susceptible to positioning and requires regular calibration.
Fusing this type of information with actual speeding frequency - speeding above posted speed limits - may provide an objective and precise safety evaluation.
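As an illustration of this fusion idea, the sketch below blends accelerometer-detected harsh events with speeding frequency into a single 0-100 safety score. The weights, thresholds, and normalization here are hypothetical placeholders, not an industry standard:

```python
def safety_score(harsh_events, speeding_km, total_km,
                 w_events=0.6, w_speeding=0.4):
    """Illustrative driver safety score on a 0-100 scale.

    harsh_events -- count of harsh braking/accelerating/cornering events
    speeding_km  -- kilometres driven above the posted speed limit
    total_km     -- total kilometres driven in the same period
    Weights and thresholds are hypothetical; a real model would be
    calibrated on fleet data.
    """
    if total_km <= 0:
        raise ValueError("total_km must be positive")
    events_per_100km = harsh_events / total_km * 100
    # Map events/100 km into [0, 1]; 10+ events per 100 km scores worst.
    events_penalty = min(events_per_100km / 10.0, 1.0)
    speeding_share = min(speeding_km / total_km, 1.0)
    risk = w_events * events_penalty + w_speeding * speeding_share
    return round(100 * (1 - risk), 1)

# A driver with 4 harsh events and 50 km of speeding over 1,000 km:
print(safety_score(harsh_events=4, speeding_km=50, total_km=1000))  # 95.6
```

In a real deployment, the accelerometer-based penalty and the speeding share would be calibrated against observed incident rates rather than fixed constants.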
On-the-fly monitoring of various driving features (RPM, abrupt start, dangerous cornering, speeding) gives instantaneous insights and enables FMOs to immediately execute corrective educational programs and preventive actions.
Now, what if the data that TSPs provide reveals just part of the picture?
Establishing the correlation and causality between safety-related patterns is something proactive FMOs want to achieve - to find the root cause of a certain undesirable behavior. To do so, they need a fast and flexible analytics backend provided by their TSPs.
Few ways to improve safety analytics:
As with safety, compliance is an important area for the FMO. External/internal compliance is a brand-trust issue, and violations are expensive from every perspective, not only the financial one.
For example, the fleet's driving style is an important ingredient of the perception of the company and the brand itself. The FMO's goal is to have consistently well-mannered driving styles across all drivers.
Few ways to improve compliance:
Insurance costs - every year, painful negotiations with insurers. How could fleet management data help our insurance procurement department gain more leverage in negotiating insurance prices?
Insurance companies tend to leave behind old-school demographic-based risk modeling and are turning to driving-habits risk modeling. Modern insurers more frequently offer Pay As You Drive (PAYD) / Pay How You Drive (PHYD) insurance policies as a way to increase efficiency and equity and help achieve policy objectives, including increased traffic safety, consumer affordability, energy conservation, and pollution reduction.
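A minimal sketch of how a combined PAYD/PHYD premium could be derived from aggregated fleet data. The reference mileage, factors, and caps below are illustrative placeholders, not actual actuarial practice:

```python
def phyd_premium(base_premium, annual_km, safety_score):
    """Hypothetical Pay-How-You-Drive premium adjustment.

    base_premium -- the demographics-based annual premium
    annual_km    -- kilometres driven per year (the pay-as-you-drive part)
    safety_score -- 0-100 aggregated driving-behaviour score
                    (the pay-how-you-drive part)
    All factors and bounds are illustrative, not actuarial.
    """
    # Mileage factor: 15,000 km/year as the assumed reference exposure.
    mileage_factor = annual_km / 15_000
    # Behaviour factor: a perfect score roughly halves it,
    # a zero score doubles it.
    behaviour_factor = 2.0 - 1.5 * (safety_score / 100)
    premium = base_premium * mileage_factor * behaviour_factor
    # Cap the discount/surcharge at +/-50% of the base premium.
    return round(min(max(premium, 0.5 * base_premium),
                     1.5 * base_premium), 2)

print(phyd_premium(1000, 15_000, 90))  # low-mileage, safe driver: 650.0
print(phyd_premium(1000, 30_000, 40))  # high-mileage, risky driver: 1500.0
```

The point is not the particular formula but that such factors can be computed from aggregated telematics values only, without exposing raw trip data to the insurer.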
A TSP with a flexible and fast backend analytics solution can act as a mediator across the technology barrier between FMOs and insurers. Having a TSP that enables effortless and accessible analytics of key parameters gives actuaries more incentive to create pay-how-you-drive insurance policies.
At the same time, a TSP must ensure data privacy and security, so that the confidential connected car data is available only as aggregated values and is kept at the TSP's premises at all times.
Statistics on asset utilization, dwell time, downtime, on-time delivery percentage, and stop times provide valuable insights to FMOs because they allow them to spot opportunities for increasing fleet utilization efficiency and overall productivity.
An innovative TSP should understand these cornerstones. Besides providing the most common analytics, it should provide FMOs with a flexible approach to accessing and analyzing vehicle data in an aggregated fashion.
With such tools, FMOs can make much more complex and tailor-made analyses of the adverse effects of different parameters, e.g., a correlation between fuel economy and on-time delivery. They also allow for the deployment of artificial intelligence to predict mechanical issues and schedule preventive repairs.
Few ways to improve productivity analytics:
Repairs after vehicle breakdowns are much more expensive and time-consuming than optimally scheduled preventive maintenance. While designing vehicles, OEMs' R&D departments conduct numerous tests to find anomalies and spare their customers problems (at least during the warranty period).
Why wouldn't FMO's maintenance engineers be able to do the same thing while they monitor the fleet's health? Why wouldn't FMOs use existing data to estimate the probability of certain equipment failing?
Modeling proper predictive maintenance requires monitoring trending problems and gaining historical insight into the correlations and causations between driving patterns, sensor data, and real failure events.
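As a toy illustration of such modeling, the sketch below scores failure risk with a logistic function over a few sensor-derived features. The coefficients are hypothetical placeholders; a real model would be fitted on historical sensor data and actual failure events:

```python
import math

def failure_probability(engine_hours, avg_coolant_temp_c, harsh_events):
    """Illustrative logistic model of a component-failure probability.

    The coefficients below are hypothetical placeholders; in practice
    they would be fitted on historical sensor data and real failures.
    """
    # Linear combination of (scaled) risk drivers.
    z = (-6.0
         + 0.0008 * engine_hours               # wear accumulates with usage
         + 0.04 * (avg_coolant_temp_c - 90)    # overheating raises risk
         + 0.02 * harsh_events)                # abusive driving raises risk
    return 1.0 / (1.0 + math.exp(-z))

# Flag vehicles whose estimated failure risk exceeds a threshold.
fleet = {
    "VAN-01": failure_probability(1200, 91, 10),
    "VAN-02": failure_probability(5200, 103, 140),
}
to_inspect = [vid for vid, p in fleet.items() if p > 0.3]
print(to_inspect)  # ['VAN-02']
```

Even a crude model like this turns raw telemetry into an actionable inspection list, which is the essence of the predictive-maintenance proposition.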
TSPs that can provide sophisticated predictive maintenance models, or an easy way for FMOs to build them in-house, can count on greatly improved customer satisfaction ratings, as well as a better brand image.
Few ways to improve maintenance analytics:
Fuel consumption is one of the largest factors influencing the Cost-per-Kilometer (CpK), typically making it the most desirable value to reduce.
TSPs typically offer standard fuel analytics, such as mpg averages, but the true power lies in the identification of anomalous patterns. Anomalies include drivers or vehicles that tend to consume too much fuel, and finding the reasons behind such occurrences ultimately reduces the FMO's expenses.
Outlier and anomaly detection can be brought to a higher level by a deeper investigation of correlations between driving patterns, such as speeding, accelerating, distribution of drives per road type, city areas, etc. TSPs that provide FMOs tools to quickly and easily test possible correlations and thus make the decisions can truly impact customers' success.
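For instance, testing a suspected link between speeding and fuel burn boils down to a correlation over per-vehicle aggregates. A self-contained sketch with toy data (the numbers below are invented purely for illustration):

```python
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient (no external libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-vehicle aggregates: share of kilometres spent
# speeding vs. fuel consumption (l/100 km).
speeding_share = [0.02, 0.05, 0.08, 0.11, 0.15, 0.21]
fuel_l_100km   = [7.1,  7.4,  7.9,  8.3,  8.8,  9.6]

r = pearson(speeding_share, fuel_l_100km)
print(f"correlation: {r:.2f}")  # strongly positive on this toy data
```

The hard part in practice is not the statistic itself but producing such per-vehicle aggregates quickly over billions of raw records, which is exactly where backend speed matters.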
Fuel consumption reduction, for example, cannot be limited to examining driving behavior, because fuel consumption also depends on vehicles' everyday routes. Customer locations heavily influence an FMO's CpK - driving on urban roads typically burns more fuel than driving on highways.
Few advanced fuel analytics:
Speed and simplicity in data handling are key for companies outside the big-data-analytics-experts circle, and this is what standard reports/analytics are for. On the other hand, seeing how one of your teams compares to another on the same type of tasks, or how one vehicle brand compares to another in similar driving circumstances, gives fleet managers the most precious internal fleet insights. On top of that, seeing your fleet's performance compared to the average performance of other FMOs with similar characteristics is the icing on the cake.
Benchmarking on a large scale is rather difficult and requires a highly advanced analytics backend on the TSP's side.
But being able to draw conclusions from such inquiries gives TSPs an extraordinary opportunity to differentiate, and fleet operators a truthful and up-to-date reality check on particular fleet issues.
What to benchmark?
Advanced analytics doesn't help only with standard internal/external comparisons like consumption, idling times, utilization, and safety parameters. It enables much more complex things, like measuring the influence of fleet performance indicators on the brand image. Company and brand values can translate into a set of desirable and measurable parameters that represent a Fleet Profile.
For instance, a fleet profile might combine KPIs for fuel consumption, utilization percentage, on-time delivery percentage, and the fleet's speed-profile-uniformity measure - any parameter combination that the company chooses and believes represents its vision and values. Having these parameters monitored and confronted with some other values (e.g., third-party complaints about the fleet's driving behavior) enables companies to clearly see how the fleet profile affects the brand image.
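A minimal sketch of such a composite Fleet Profile. The KPI set, the normalization to [0, 1], and the weights are examples a company would choose for itself:

```python
def fleet_profile_score(kpis, weights):
    """Illustrative Fleet Profile: a weighted blend of normalized KPIs.

    kpis    -- dict of KPI name -> value already normalized to [0, 1],
               where 1 is the desirable end of the scale
    weights -- dict of KPI name -> relative weight (normalized here)
    The KPI set and weights are examples, chosen by the company itself.
    """
    total_w = sum(weights.values())
    return round(sum(kpis[k] * w for k, w in weights.items()) / total_w, 3)

profile = fleet_profile_score(
    kpis={"fuel_economy": 0.72, "utilization": 0.81,
          "on_time_delivery": 0.95, "speed_uniformity": 0.66},
    weights={"fuel_economy": 2, "utilization": 1,
             "on_time_delivery": 3, "speed_uniformity": 1},
)
print(profile)  # 0.823
```

Tracked over time and set against external signals such as third-party complaints, a single score like this makes the fleet's contribution to brand image measurable.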
Again, the execution of similar innovative approaches quickly and straightforwardly on massive datasets is not a trivial task. But it represents a new and tangible opportunity for TSP to innovate and amaze its clients with some genuinely new offer.
Beyond the usual TSP clients, there seem to be some new prospects on the horizon that might make use of the data TSPs possess.
TSPs with heterogeneous fleets can offer insightful information about the behavior and driving patterns of the general traffic.
City authorities often claim that one of their top priorities is improving the city's transportation efficiency.
Strong public pressure on city officials is focused on much more efficient management of peak-hour congestion, optimizing traffic light intervals, and reducing average journey lengths, thus bringing CO2 levels down.
Traffic regulators strive to better understand typical-day driving patterns, the types of roads used, how much time is spent waiting in front of a traffic light, and when vehicles get stuck in traffic.
What happens with traffic during the chaotic times of special events (sporting events, concerts, serious traffic accidents)? Traffic regulators would like to optimize in real time: provide info on possible routes, reduce travel delays, and tune smart traffic light systems to relieve current congestion. For all this, a relevant data source is critical.
And this is where a stable real-time flow of fleet data, accompanied by historical data and excellent analytical possibilities, gives a unique opportunity to create real value for the smart city and additional revenue for the TSP.
Closely associated with the traffic flow data that TSPs already possess are billboard advertising optimization and geomarketing. By understanding traffic patterns, advertisers can deliver highly relevant content through strategically positioned billboards with curated content. The benefits increase when such geomarketing is combined with matching online marketing, e.g., to increase in-store visits.
By tracking origin-destination matrices, daily rural-city migrations, or tourist migrations, advertisers can achieve better exposure of marketing billboards to drivers and passengers, measured through the number and type of vehicles, time slots, or average passing-by speeds. Then, there is also the possibility of real-time adjustment of marketing ads.
Retailer centers typically want to know the origin-destination matrices of their own and competitors' retail site positions. The ability to profile the potential customers' moving patterns and behaviors is essential for site planning.
Equally vital for retail center placement screening and for managing existing retail centers is customer identification in terms of their residential areas and transport routes.
To start, a better understanding of travel patterns in surrounding areas is valuable information for targeted billboard placement and social media campaigns. For example, the identification of the locations from which customers do not come is equally important. In order to increase in-site visits and gain new customers, retail centers can utilize this information to precisely target advertisement operations towards certain geographical areas, especially during seasonal sales. Moreover, awareness of specific visiting times during the day, weekdays, and holidays can facilitate balancing customer targeting.
Furthermore, per Deloitte's 2020 The future of the mall study, foot traffic in retail centers had been declining even before the coronavirus outbreak, and consequently many retailers are re-evaluating the profitability of brick-and-mortar stores. Without being limited to financial reports only, understanding the nearby areas' potential can help when culling lower-performing stores.
These are industries that might be interested in aggregating the TSP's enriched data insights with those of other data providers in order to package and resell them.
The number one revenue stream for larger TSPs is live-traffic and base map generation programs. TSPs forward telematics data to map data providers' servers, where it is later used to enrich, correct, and update base map data, as well as to generate valuable live-traffic information. Unfortunately, as many TSPs have witnessed, this option is not that well priced.
Vehicle Data Hubs (VDH), or platforms for exchanging connected car data with interested third parties, are the next prospect for raw telematics data. Instead of individually negotiating with potential customers, reselling data through VDHs facilitates the data transfer and provides a financially more stable income source. The better the VDH's marketing and analytics tools, the bigger the resale value.
What are the main obstacles FMOs face when deriving valuable business decisions from telematics data, and how can TSPs help empower FMOs to gain insights from the raw telematics data?
The volume of data received from GPS tracking units snowballs with time, which is why telematics data, fused with readings from numerous vehicle-integrated sensors, has always been the biggest of Big Data. And deriving insights and revealing disguised information from such specific data is, in the full sense, data science.
An agile FMO trying to introduce telematics data-driven decisions faces the need to establish an entire data science IT department, simply to cope with the complexity of the work required. Why? Because the extent of work, skills, and capabilities surpasses the capacity of a single data scientist.
Typically, the data science workflow consists of:
The data science workflow necessarily includes a variety of people to support data-driven business decision-making, because every step of the workflow requires a variety of disciplines and skills. The Big Data analytics organizational unit therefore includes:
On top of those requirements, the demand for data scientists skyrockets year over year, as a growing number of businesses rely ever more heavily on data-driven insights. Per the LinkedIn 2021 Jobs Report, hiring for data science roles has grown 46% since 2019, making it even more challenging for FMOs to set up data science teams.
However, the challenging task of building data science teams can be dramatically simplified by automating procedures and introducing maximally simplified work tools. Ideally, provided that the work tools are excellent, the need for IT, DevOps, and database engineers can be eliminated completely.
The natural course of action would be to eliminate the need for extensive IT support by carefully selecting the right working tools. However, the selection of well-suited technologies on the market is very limited.
The second enormous obstacle - one whose solution partially removes the previous one - is the choice of technology. FMOs face the need to select appropriate tools to analyze telematics data, which happens to be a highly non-trivial task.
Unfortunately, when it comes to extra-large telematics datasets, none of the Big Data tools on the market can cope with them.
Features inherent to telematics data (like an extremely skewed spatial data distribution) are the reason why, even at low volumes of data, the time performance of today's Big Data tools degrades to unusability.
To support these statements, let's quote J. Andrew Rogers, a pioneer in ultra-scale geospatial computing.
Geospatial databases, at a basic computer science and implementation level, are unrelated to more conventional databases. The surface similarities hide myriad design challenges that are specific to spatial data models. All of the architectural differences below are lessons I learned the hard way, manifested as critical defects in real-world applications. You can think of it as a checklist for “is my geospatial database going to fail me at an inconvenient moment”. J. Andrew Rogers
Most geospatial databases were built for creating maps. As in, geospatial data models that can be rendered as image tiles or paper products. Mapping databases evolved in an environment where the data sets were small, rarely changed, and production of a finished output could take days to complete. J. Andrew Rogers
Modern spatial applications are increasingly built around real-time spatial analysis and contextualization of data from IoT, sensor networks, and mobile platforms. These workloads look nothing like making maps. In fact, these workloads are unlike any workload studied in database literature. There is no existing design to copy that can support this use case. A database engine optimized for a modern geospatial workload must be designed and implemented from first principles, which requires unusual levels of skill. J. Andrew Rogers
Although Rogers wrote the above in 2015, the reality hasn't changed much since then, and most high-profile Big Data tools are still worthless in the telematics use case.
The majority of off-the-shelf tools provide a click-and-wait experience for analysts as the response times vary from minutes to hours, even for relatively simple inquiries on relatively small datasets.
Consequentially, analysts perceive them as deficient and inadequate for systematic investigation and gaining insights. The slow speed of mainstream Big Data tools effectively prevents scientists from working at their curiosity speed, deteriorates productivity, and leaves plentiful opportunities unexplored.
GeoMesa, GeoSpark, Amazon Redshift, Oracle Exadata, Hive, Databricks, and Snowflake are some of the most well-known Big Data tools with full or partial geospatial support. GeoMesa, GeoSpark, Hive, Databricks, and Snowflake are all built on the Hadoop/Spark ecosystem, meaning they suffer from the same issues and, per a 2017 study by the consulting company Data Reply, have proven to be incredibly slow and ineffective in spatio-temporal analyses.
None of the abovementioned mainstream databases can deliver even tolerable performance when the data size exceeds a couple of billion records.
To be precise, mainstream Big Data tools are destined to break apart when data size starts exceeding dozens of billions of records. To put this into a telematics perspective, a dozen billion records equal roughly one year's worth of data from 100,000 vehicles. Namely, query performance degrades exponentially as data size increases, while the telematics backend keeps acquiring new vehicle data over time.
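A quick back-of-envelope check of that figure, assuming an illustrative reporting profile of one record every 30 seconds during an 8-hour workday, 250 working days a year:

```python
# Back-of-envelope: yearly record volume of a 100,000-vehicle fleet.
# Assumed reporting profile (illustrative): one record every 30 seconds
# during an 8-hour workday, 250 working days per year.
vehicles = 100_000
records_per_day = 8 * 3600 // 30          # 960 records per vehicle per day
records_per_year = vehicles * records_per_day * 250

print(f"{records_per_year:,} records/year")  # 24,000,000,000 records/year
```

Even under these conservative assumptions the fleet lands in the tens of billions of records per year, right where mainstream tools give out.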
At the 20-billion-record mark, all mainstream tools effectively stop working and cannot be fixed.
Currently, only two companies are trying to solve the issue of analyzing highly non-uniform geospatial data in some manner and somewhat successfully.
Both of them come with serious drawbacks when handling telematics data. Their inherent design flaws in handling highly non-uniform telematics data are compensated for with immense amounts of usually expensive hardware resources.
Google BigQuery, the first one, is a fully managed cloud-based solution, with support for spatial data only since 2018.
But the underlying technology it uses is incredibly inadequate for handling telematics data with moderate hardware resources. Why?
Vehicles' tracking data is mostly located on urban roads and less frequently on rural ones, which is why the data is so non-uniformly distributed. Some roads are overwhelmed with vehicles, while others remain almost empty over time.
Google BigQuery uses the so-called Geohash geocoding system to index spatial data. Due to the high skewness of vehicle data locations, Geohash indexing quickly degenerates to a useless level. This is exactly the same reason why Mireo SpaceTime outperforms GeoMesa by 17x even on a small telematics dataset (benchmark independently conducted by Ericsson Nikola Tesla d.d.; the complete report is available upon request).
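The degeneration is easy to demonstrate on synthetic data. The sketch below encodes heavily skewed vehicle positions with a minimal Geohash implementation: when 95% of the points fall inside one small urban hotspot, a single Geohash cell ends up holding nearly all the records, so prefix-based partitioning does almost nothing (the point distribution is invented for illustration):

```python
import random
from collections import Counter

BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash(lat, lon, precision=5):
    """Minimal Geohash encoder (interleaves lon/lat bisection bits)."""
    lat_rng, lon_rng = [-90.0, 90.0], [-180.0, 180.0]
    bits, even, ch, out = 0, True, 0, []
    while len(out) < precision:
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            ch = ch * 2 + 1
            rng[0] = mid
        else:
            ch = ch * 2
            rng[1] = mid
        even = not even
        bits += 1
        if bits == 5:
            out.append(BASE32[ch])
            bits, ch = 0, 0
    return "".join(out)

# Synthetic skew: 95% of positions inside one ~1 km urban hotspot,
# 5% scattered over a country-sized region.
random.seed(42)
points = [(45.80 + random.uniform(0, 0.01), 15.97 + random.uniform(0, 0.01))
          for _ in range(9500)]
points += [(42 + random.uniform(0, 6), 13 + random.uniform(0, 6))
           for _ in range(500)]

cells = Counter(geohash(lat, lon, 5) for lat, lon in points)
top_cell, top_count = cells.most_common(1)[0]
print(f"hottest cell {top_cell} holds {top_count / len(points):.0%} of points")
```

An index whose hottest cell carries almost the entire dataset cannot prune anything, which is precisely why skew-aware spatial partitioning matters for telematics workloads.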
To mitigate the problem, BigQuery may opt to distribute the data across a large number of nodes based on the data records' timestamps. This is the fundamental reason why Google BigQuery requires absurd amounts of hardware resources to handle telematics data.
Alongside the resources, the intricate pricing model makes it practically impossible to estimate the service price.
However, to facilitate interaction with such complex technology, Google recognized the importance of a standardized interface, such as the widely spread SQL-like interface.
One of the well-known BigQuery telematics customers is certainly the Canada-based company Geotab. Geotab is the utmost example of how providing off-the-shelf data-driven analytics at speed can fuel business expansion.
To manage its fast growth, caused by a sudden tripling of the number of connected vehicles from 400,000 in 2016 to 1.4 million in 2018, Geotab partnered with Google BigQuery.
The partnership enabled Geotab to focus on development instead of platform maintenance, simultaneously creating a competitive edge by providing its customers with data-driven analytics.
With Google BigQuery, Geotab enabled its customers to gain relevant insights from raw vehicle data in an easy and fast manner. Google BigQuery's standard interface (SQL-like syntax) proved to be the biggest benefit, given how widely analysts are familiar and experienced with it. The speed of accessing the data enabled Geotab's customers to rely on real-time data insights.
Moreover, with such a vast amount of data and the ability to create valuable insights at scale, Geotab has managed to venture into smart city initiatives, thus creating a new revenue stream.
However, to support data processing from over 2 million vehicles, Geotab's services are hosted across over 1,200 virtual machines, growing at a weekly rate of 3-4 servers.
The second Big Data spatial database provider, the USA-based OmniSci, is also one of the forerunners amongst accelerated-analytics providers. Instead of relying on traditional CPUs to achieve performance, OmniSciDB leverages the processing power of graphics cards (GPUs).
OmniSci designed its platform specifically to overcome mainstream analytics tools' limits by redefining speed and scale in big data querying.
With Accelerated analytics being their slogan, OmniSci understands that lag times in spatiotemporal Big Data interaction frustrate analysts and data scientists. The slow speeds of mainstream tools harm analysts' productivity, while decision-makers, more often than not, receive outdated, incomplete, and imprecise insights.
However, leveraging the processing power of graphics cards is both OmniSciDB's biggest strength and its biggest disadvantage.
The entire analyzed dataset needs to fit inside the graphics cards' memory, which is particularly unsuitable and extremely expensive for bigger datasets, especially telematics data.
To put this into a telematics perspective, 64 servers with integrated graphics cards, at a commercial value of $15,000 each, would be needed to store one year's worth of data from 100,000 vehicles.
Although the OmniSciDB platform requires somewhat more moderate resources compared to Google BigQuery, OmniSci's technology is equally inefficient and expensive in the telematics industry.
Mireo SpaceTime is an analytical add-on for processing and storing telematics data that works in parallel with the existing TSP backend, focusing on providing extensive but simple tools for the analysis of the vehicle behavior.
Mireo SpaceTime aims to break the barriers that virtually all Telematics Service Providers face when trying to grow their business, both vertically and horizontally. TSPs are sitting on an enormous pile of telematics vehicle data, and any attempt to monetize it is by no means an easy task.
The cornerstone of Mireo SpaceTime is the ultra-fast database and query engine, designed and implemented specifically for telematics data analysis. Mireo SpaceTime exposes a simple SQL interface for accessing (historical) telematics data. SpaceTime implements standard ANSI SQL with a few extensions specific to the telematics data environment. Notably, the extensions are designed to simplify spatial operations and access to map data.
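Purely as an illustration of the kind of question an analyst can phrase through a plain SQL interface, the snippet below builds a standard ANSI SQL aggregation query. The table and column names are hypothetical, not the actual SpaceTime data model:

```python
# Illustrative only: a fleet-level aggregation an analyst could phrase
# in plain ANSI SQL against historical telematics data. The schema
# (table and column names) is hypothetical, not SpaceTime's own.
query = """
SELECT vehicle_id,
       AVG(speed_kmh)   AS avg_speed,
       SUM(distance_km) AS total_km,
       COUNT(*)         AS samples
FROM   gps_records
WHERE  recorded_at >= DATE '2021-01-01'
GROUP  BY vehicle_id
ORDER  BY total_km DESC
"""
print(query.strip())
```

The value of a familiar SQL surface is exactly this: an analyst needs no new query language to go from raw positional records to per-vehicle aggregates.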