Honesty in Prediction Models: Let’s Be Honest

Prediction can be a hard problem to solve and, oftentimes, an unsolvable one. When discussing honesty in prediction models, we need to recognise the challenges of modelling complex systems. How many times have we seen articles about some financial market, be it housing, stocks or crypto, predicting some behaviour, only for the opposite to happen? For the sports fans out there, it is clear that no matter how much you watch a sport or how much information you gather and take into account, surprises still happen. Just think of Greece’s victory over Portugal at Euro 2004, or Paul the Octopus’s uncanny football predictions!

Paul the Octopus

That is because life happens: the world is full of chaos, and unless we are modelling physical laws of nature, it is pretty hard to predict. This chaotic nature of the world makes honesty in prediction models even more critical. It’s important to acknowledge that forecasting isn’t always precise, and we should embrace transparency in the process. This is both a bit scary and very exciting. It is the reason I moved from a purely mechanical, calculate-the-right-answer career to something more open-ended. It means that if you can predict something of value, it is worth doing, and it can set you apart. But it is hard.

I am not going to go into how to predict things, or even why some things are easier to model than others; there are already plenty of very good books on those subjects. Instead, I am briefly going to focus on why it is important to be honest with ourselves: this can be a tricky process, there isn’t always enough signal, and sometimes things don’t work, but all of that is okay. What matters is cultivating an environment where we are honest about our abilities, where it is okay for things not to work, and where we share our failings along with our successes. The more comfortable we are with these aspects, the better I believe things can be.

Belief in Possibility: The Attitude of “Yes, We Can”

I grew up in South Asia, and one of my favourite cultural observations compared with the UK was the attitude to repair. If you had a broken device and took it to a shop to ask about repair, the answer was almost always “yes, no problem”; almost always, things could be repaired. On occasion things wouldn’t quite come back the same, whereas in the UK the response feels like it is almost always “Sorry, it looks like a write-off, better to just get a new one”. Now, I am not going to comment on attitudes to disposability or access to goods; I am aware there are many factors at play. But the response was always one of belief and acceptance: “Yes, we can do it.”

I love this attitude, but it does have its place. I have been fortunate to work under some real believers, which brings many positives; however, it did mean that any time someone asked whether our team could deliver something, the answer would almost always be yes (think Jim Carrey’s Carl from Yes Man), a behaviour I am certainly guilty of myself.

Honesty in Prediction Models: saying yes

There was, however, a very noticeable difference between making build promises from an engineering perspective versus an analytical one. Saying yes to building a new endpoint or piece of platform functionality is very different from saying yes to building a model that is X% accurate or delivers £X of value. From an engineering perspective, I am comfortable with what I can and cannot build; sure, there will always be bumps along the way, but can it be built? I feel I can answer that. In the analytical space, it is a whole other story. It is why you see stories like “Even After $100 Billion, Self-Driving Cars Are Going Nowhere”, and issues like stop signs to the right; building predictive models is considered experimentation for a reason.

Honesty in Prediction Models: self driving cars

We don’t know what results we can get. We may be able to form a good idea, but promising beforehand, or working to specific target metrics without exploring and iterating, is not ideal; there can be so many unknown factors and outliers at play. So we should be open about this and say, “yes, there are possibilities and things we can explore”, but we need to iterate and try things out. Only then can we start thinking about deliverables and saying what we will achieve, and even then, delivery needs to be structured in a way that allows for experimentation (as briefly discussed below).

This process and mindset should be a good thing; after all, it leads to fewer broken promises, less stress and anxiety, and fewer moments of your data scientists working late into the night asking themselves “why won’t this model just work?”.

That’s why, at Predyktable, we prioritise fostering an honest dialogue about what’s achievable. We explore, iterate, and refine our approaches, recognising that the path to innovation is marked by valuable learning moments. By embracing experimentation, we minimise the pressure of over-promising and under-delivering. Being transparent about our limitations and effectively managing expectations is crucial to upholding honesty in prediction models. This approach ensures that all stakeholders are aligned with realistic outcomes rather than being misled by overly optimistic projections.

It’s OK to Stop: Avoid the ‘Pot Committed’ Trap

Sometimes it does happen: you’ve built it up to your stakeholders, spent ages learning new frameworks, gathered more data, burned through resource hours, and you are still not getting results. For those who don’t play poker, being “pot committed” means you have already put in so many chips, or otherwise risked so much, that you might as well follow through with the plan. But this should never be the case in our work: you shouldn’t be afraid to stop, and stopping should be acceptable.

Earlier in my career, whilst working in insurance, I was supporting some price elasticity analysis. We had conducted price tests to see how consumers would react to higher and lower prices: would we get more demand? Could we generate more margin? We spent a reasonable amount of time on the piece, but our conclusion was that we didn’t have enough data, and would probably never have enough data given our position in the market. We had to present these results at a steering committee meeting to the company’s senior leadership team, including the CEO. Obviously, early in your career (and perhaps at any stage) this can be quite intimidating; “Uhh, we can’t really tell anything; all of this has been a bit of a waste of time” is not the ideal message. But I remember how accepting the committee was and how supportive my manager was in delivering the message. It was very formative of my outlook: being confident in the truth, owning it, and creating a space for it around me with the people I work with and those I manage.

Perhaps the key takeaway is learning to recognise that point earlier in the work we are doing, so we stop wasting resources down fruitless rabbit holes. It isn’t entirely avoidable; we humans like our challenges. There are lots of frameworks to help with this, but the key for me is small, quick iterations and evaluations: how much improvement are we seeing for the work we are putting in? We know there are going to be diminishing returns somewhere, and identifying that point early is gold.

The Importance of Honesty in Prediction Models: Embracing the Value of Experimentation

If there is such a degree of uncertainty in data science, you may ask, how are we meant to plan and deliver effectively? There are many different models that companies have employed; see “Models for integrating data science teams within organizations” for some additional reading. I went to a great talk by Nick Jakobi at one of the PyData meetups, where he talked about delivery within agile frameworks. He spoke about the idea that experimentation, as a task, has the aim of answering a question: can we predict this? Does this show that? The key thing was that, whatever the answer, the result of the experiment was still the delivery of information, and that information was inherently valuable and should be treated as such.

For example, let’s say you are working on improving a forecasting model, and after spending a sprint’s worth of work on it, you simply cannot make any significant improvement to its performance. In that case, you have learned exactly that: given the effort invested, model improvements are not feasible at this point in time. That learning is valuable. I believe his team used a knowledge base to record the result of each learning and task, so that it was easy to access and allowed them to learn globally about what works and what doesn’t. At Predyktable, we view every learning as an important step forward, and we ensure these insights are captured and shared to benefit the entire team.

Show me your code

By being okay with failure and cultivating a sense of honesty and trust, we can alleviate so many issues before they arise, but again, this is difficult. Most will have seen Elon Musk asking his engineers to show him their ‘most salient lines of code’ as he went about his firing spree; I am sure this evoked a bit of fear in those who wished to stay.

As with many industries, there is an epidemic of imposter syndrome within tech, and this can breed a reluctance to show and share code and low-level results, especially in the analytical domain. But again, cultivating the ability to be honest, to be open to challenge and review (what we would expect in academia), and to reach out about where we are struggling helps to alleviate this.

You will still see scenarios like the COVID modelling by Imperial College London, where external validation was initially not possible, causing some concern, or the re-offending prediction models used in the US, where predictions impact people’s sentencing yet there is a strong reluctance to show the process behind them.

So instead of imitating Colonel Jessep from A Few Good Men and maintaining that ‘you can’t handle the truth’, we should lean towards Fletcher Reede from Liar Liar (again, Jim Carrey) and build an environment where we can be honest with each other, which should be beneficial for all involved.

Honesty in Prediction Models: you can't handle the truth

Creating a Culture of Honesty and Trust

In tech, imposter syndrome is rampant, and the reluctance to share code or low-level results can be real. But at Predyktable, we cultivate a culture where openness and honesty are valued. Sharing both successes and struggles, seeking feedback, and validating models externally are all critical to building trust within the team and with our clients.

We don’t believe in hiding behind processes or pretending everything is perfect. Instead, we embrace the mindset that honesty—about both challenges and achievements—benefits everyone involved.

At Predyktable, we value transparency and embrace honesty in prediction models. By fostering this culture, we deliver better results for our clients while creating a supportive and collaborative work environment.

Reach out to speak to our experts today.

Tapping into the Power of Text: Using LLMs to Improve Field Service Operations with a Predictive Maintenance Model

As a data science startup, we were thrilled to partner with a major services provider in the food and beverage (F&B) industry. Our mission? To build a predictive maintenance model using LLMs that could accurately forecast equipment failures and maintenance requirements in the near future – allowing the service provider to streamline their operations.

Initially, we focused on leveraging the structured data at our disposal. We crafted a classical machine learning model, carefully engineering features from historical maintenance records, equipment specifications, and external factors like weather, regional events, and their impact on F&B consumption. While this model had great performance metrics, we were determined to push the boundaries further.

Our attention turned to the treasure trove of unstructured text data nestled within service technicians’ notes to enhance our predictive maintenance model using LLMs. Could these notes hold the key to unlocking even greater predictive power?

That’s when Large Language Models (LLMs) entered the scene. By now, we’ve all interacted with LLMs in some capacity. We have witnessed their tremendous prowess at generating new text, images and even videos. However, LLMs extend far beyond just text generation; they play a crucial role in building predictive models by generating text embeddings. 

LLMs are deep learning algorithms trained on massive datasets of text and code, allowing them to understand and generate human-quality text. A key capability of LLMs is their ability to generate “embeddings,” which are essentially numerical representations of words, sentences, or even entire paragraphs. Imagine you have two sentences: “Taylor Swift’s concert was a resounding success” and “Arsenal dominate Chelsea in a five-star performance”. While these sentences differ greatly in content, LLMs can convert them into numerical vectors (embeddings) that capture their underlying meaning. These embeddings would highlight the “positive sentiment” shared by both sentences, albeit in different contexts—one musical, the other athletic.

These embeddings become incredibly powerful tools for analysing unstructured text data, identifying patterns, and uncovering hidden relationships. Since the embeddings are just numbers, they can be fed into classic machine learning models like any other features and could allow the model to derive insights from the text data.
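A minimal sketch of this pattern, assuming the open-source sentence-transformers and scikit-learn libraries: the model name, the extra tabular features and the labels below are illustrative stand-ins, not the project’s actual setup.

```python
# Embed two sentences and feed the vectors, alongside ordinary tabular
# features, into a classical model.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import cosine_similarity

encoder = SentenceTransformer("all-MiniLM-L6-v2")
sentences = [
    "Taylor Swift's concert was a resounding success",
    "Arsenal dominate Chelsea in a five-star performance",
]
embeddings = encoder.encode(sentences)        # shape: (2, 384)

# Similar underlying meaning (shared positive sentiment) shows up as a
# relatively high cosine similarity between the two vectors.
print(cosine_similarity(embeddings)[0, 1])

# Because embeddings are just numbers, they can sit next to any other
# tabular features in a classical model.
other_features = np.array([[20.0, 1], [18.5, 0]])   # e.g. temperature, weekend flag
X = np.hstack([embeddings, other_features])
y = [1, 0]                                          # hypothetical labels
clf = LogisticRegression(max_iter=1000).fit(X, y)
```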

Classical Approach: Text Embeddings and Their Limitations in Predictive Maintenance Model Using LLMs

Our initial approach involved using pre-trained LLMs to generate embeddings for the client’s service notes. These embeddings, often hundreds of numbers long, captured semantic information about the text. We then fed these embeddings into a predictive model along with other relevant features to estimate the likelihood of high maintenance needs at each location.

predictive maintenance model using LLMs, Classical Approach: Text Embeddings.

While conceptually sound, this approach faced challenges:

  • Large Embedding Size: The high dimensionality of the embeddings increased model complexity and computational costs.
  • Not Specific to Client Data: Pre-trained LLM embeddings are optimised for general language understanding and might not accurately capture the nuances specific to our client’s industry and operational context. In the above example, the embeddings haven’t been specifically trained to relate text to high maintenance needs in the F&B industry.

Enter Fine-Tuning: Tailoring LLMs for Specific Tasks in Our Predictive Maintenance Model Using LLMs

Fine-tuning offers a solution to the above challenges by further training an already powerful LLM on a specific dataset and task, such as classifying F&B service notes. The pre-trained LLM is trained using the client’s specific data. In this case, it’s the service technicians’ and agents’ text notes along with their corresponding maintenance outcomes (how many hours of maintenance tasks are needed in the future). This training aligns the model’s understanding of language directly with the client’s terminologies and context.

Fine-tuning BERT diagram

Challenges with the Client Data

All Large Language Models (LLMs) have a limit on the number of words (tokens) they can process as input. Larger LLMs, like Google PaLM, can handle more tokens but are harder to fine-tune due to their size. Smaller LLMs, like BERT, can process fewer tokens but are easier to fine-tune with new data.

The client data we were working with was enormous—it included all service provider notes from across a geographical area for the past week. The following image shows a sample of this data:

Sample Client Text Data

This data presented us with two challenges:

  • The input data was too large to be used directly as input to easy-to-fine-tune LLMs like BERT.
  • The data looked like gibberish, and it seemed improbable that directly fine-tuning an LLM on it would add any value.

In the following sections, we will explain the two-stage process used to address these issues.

A Two-Step Solution: Summarisation and Fine-Tuning with BERT

Step 1: Summarisation: We utilised a large LLM, such as Google PaLM, to summarise the lengthy and detailed service notes into concise, information-rich summaries. These summaries focused on extracting the total number of maintenance issues and the different types of issues faced by the outlets in the area, significantly reducing the text volume without sacrificing crucial information.
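A sketch of this step, using an open-source summariser from the Hugging Face transformers library as a stand-in for the hosted LLM the project used; the sample note text is invented for illustration.

```python
from transformers import pipeline

summariser = pipeline("summarization", model="facebook/bart-large-cnn")

weekly_notes = (
    "Visited outlet 42: ice machine leaking again, replaced inlet valve. "
    "Outlet 17 reported compressor noise; follow-up visit scheduled. "
    "Outlet 42 dispenser nozzle blocked, cleaned on site. "
    "Outlet 23 no issues found during routine check."
)

# Compress a week's worth of notes into a short, information-rich digest
# that fits within a smaller model's token limit.
summary = summariser(weekly_notes, max_length=60, min_length=15)
print(summary[0]["summary_text"])
```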

LLM Summarisation

Step 2: Fine-Tuning BERT: Summarisation created a succinct view of the different types of issues faced in a particular area in the past week. This text was extremely relevant for predicting the expected number of maintenance requests in the future. The second step involved capturing this dependency by fine-tuning a BERT model. BERT (Bidirectional Encoder Representations from Transformers) is a versatile and powerful language model that is relatively easy to train on new data due to its comparatively small size.
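A minimal sketch of what this fine-tuning step might look like with the Hugging Face transformers and datasets libraries; the toy texts, binary labels and output directory are illustrative, not the client’s data.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)   # low vs high maintenance needs

data = Dataset.from_dict({
    "text": ["3 cooling faults and 2 dispenser faults reported this week",
             "no maintenance issues reported this week"],
    "label": [1, 0],
})
# Tokenise the summaries so BERT can consume them.
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True,
                                     padding="max_length", max_length=128))

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-maintenance", num_train_epochs=3),
    train_dataset=data,
)
trainer.train()
```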

The overall process is captured in the following diagram:

predictive maintenance model using LLMs: BERT (Bidirectional Encoder Representations from Transformers) Final Pipeline

Promising Results

We observed a strong correlation of 0.65 between the P(High Maintenance Needs) score generated by BERT and the actual number of maintenance visits at each location. This indicated that the BERT model, after fine-tuning, was successfully learning from the client’s service note data and translating it into meaningful predictions. The plot below illustrates the reduction in both training and validation loss over epochs (iterations) during the fine-tuning process, highlighting the effectiveness of our approach in making the BERT model better at its predictive task.
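For reference, a correlation check of this kind takes only a couple of lines; the scores and visit counts below are invented for illustration.

```python
from scipy.stats import pearsonr

p_high = [0.12, 0.80, 0.45, 0.91, 0.30]   # model scores per location
visits = [1, 6, 3, 7, 2]                  # actual maintenance visits
r, _ = pearsonr(p_high, visits)
print(f"Pearson r = {r:.2f}")             # the project observed r ≈ 0.65
```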

LLM correlation BERT Model

This solution offered a more precise and efficient approach for leveraging unstructured text data. By fine-tuning BERT, we successfully bridged the gap between general language understanding and our client’s specific business context. This project highlights how a predictive maintenance model using LLMs can harness unstructured data sources, like free-text call notes, to improve field service operations and deliver a superior customer experience.

Learn more about how a predictive maintenance model using LLMs can transform your business operations. Contact us today!

Predictive Analytics: Sharpening Demand Planning in the Supply Chain

The contemporary supply chain landscape faces a multitude of challenges. Disruptions like labour shortages and geopolitical tensions create obstacles in delivering products efficiently. Accurate demand planning in the supply chain is crucial in this environment, allowing businesses to optimise inventory levels, prevent stock-outs, and adapt to market fluctuations. Predictive analytics takes this a step further by leveraging data and advanced algorithms to generate more precise forecasts.

Accurate demand planning in the supply chain with Predyktable

Traditional vs. Predictive Demand Planning in the Supply Chain:

  • Data Sources: Traditional methods rely primarily on historical sales data and basic statistical models. Predictive analytics incorporates a broader range of data sources, including:
      • Internal sales history
      • World economics
      • Local and global events
      • Weather and seasonality

  • Forecast Accuracy & Horizon: By analysing a richer data set, predictive analytics generates more nuanced forecasts with greater accuracy. Additionally, it can extend the forecasting horizon, enabling businesses to plan further into the future.
  • Risk Management & Opportunity Identification: Predictive analytics can uncover hidden trends and potential disruptions in the data that traditional methods might miss. This allows for proactive risk mitigation and the identification of new sales opportunities.
  • Scenario Planning & Decision-Making: Predictive models can be used to simulate various scenarios, such as the impact of a marketing campaign or competitor actions on inventory needs. This data-driven approach empowers businesses to make informed decisions about resource allocation, production planning, and inventory management (see the sketch after this list).
  • Automation & Efficiency: Predictive analytics can automate repetitive tasks associated with demand planning, freeing up human planners to focus on strategic initiatives.
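To make the scenario-planning point concrete, here is a minimal sketch assuming a scikit-learn regressor trained on weekly demand; the feature names and numbers are invented for illustration.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

history = pd.DataFrame({
    "price": [9.99, 9.99, 8.49, 9.99],
    "campaign_running": [0, 1, 1, 0],
    "units_sold": [120, 180, 210, 115],
})
model = GradientBoostingRegressor().fit(
    history[["price", "campaign_running"]], history["units_sold"])

# Simulate a "what if": the same price, with and without a campaign.
baseline = pd.DataFrame({"price": [9.99], "campaign_running": [0]})
scenario = baseline.assign(campaign_running=1)
uplift = model.predict(scenario)[0] - model.predict(baseline)[0]
print(f"Predicted campaign uplift: {uplift:.0f} units")
```
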
Predictive Demand Planning in the Supply Chain

The Benefits of Implementing Demand Planning in the Supply Chain:

  • Enhanced Forecast Accuracy: Improved forecasts lead to better inventory management, reduced stock-outs, and increased customer satisfaction.
  • Proactive Risk Management: Early identification of potential disruptions allows for the implementation of contingency plans, minimising negative impacts.
  • Data-Driven Decision Making: Businesses can make strategic choices based on a deeper understanding of customer behaviour and market trends.
  • Improved Supply Chain Efficiency: Streamlined inventory management and proactive risk mitigation lead to a more efficient and resilient supply chain.

Conclusion:

Predictive analytics is not a silver bullet, but a powerful tool that can significantly improve demand planning accuracy and supply chain efficiency. By leveraging a wider range of data sources and sophisticated algorithms, businesses can gain a deeper understanding of customer behaviour and market dynamics. This empowers them to make data-driven decisions, optimise operations, and gain a competitive edge in today’s challenging supply chain environment.

Enhancing Demand Forecasting with Predictive Machine Learning

Introduction

In a fast-paced and ever-evolving world, accurate demand forecasting is crucial for success. Without it, companies risk stock-outs, overstocking, inefficient allocation of resources, and ineffective marketing strategies. Traditional methods of demand forecasting, while effective to some extent, are often plagued by inaccuracies and inefficiencies. Enter Demand Forecasting with Predictive Machine Learning (ML), a technology that is revolutionising the way businesses predict and manage demand.

In this blog, we’ll explore how predictive ML models improve demand forecasting and their potential to drive better decision-making in inventory management and marketing campaigns.

Use predictive analytics to improve demand forecasting accuracy by up to 20%, leading to a 10% increase in profits.

The Challenges of Traditional Forecasting

Traditional demand forecasting methods often rely on historical data and statistical models. While these methods have been useful, they have limitations:

  • Lack of Real-Time Adaptability: Traditional methods may struggle to adapt to rapidly changing market conditions, making them less effective in dynamic industries. In today’s fast-paced business environment, market conditions can shift in the blink of an eye. Traditional forecasting techniques, rooted in historical data, often lack the agility to respond to these real-time and rapid changes.
  • Complex Interactions: Demand is influenced by a myriad of variables, including economic trends, consumer behaviour, competitive landscapes, and more. Traditional forecasting models often oversimplify or fail to consider the complex interplay between these factors, resulting in less accurate predictions.
  • Data Volume and Complexity: Traditional methods are less capable of handling vast amounts of data and complex patterns, which are increasingly prevalent in today’s business landscape. The digital age has ushered in an era of big data, where an abundance of information is available for analysis. Traditional forecasting techniques fall short when dealing with the sheer volume and complexity of this data, leading to suboptimal predictions.
  • Lack of External Data: Traditional methods typically rely heavily on internal historical data. They often lack the capability to integrate external data sources, such as social trends, weather conditions, national sentiment, and economic indicators, which can be instrumental in refining demand forecasts. External data sources can provide critical context and insight into changing consumer preferences and market dynamics.

Predictive machine learning (ML) overcomes these limitations by incorporating external data sources, adapting to real-time changes, and identifying intricate patterns that might escape traditional forecasting methods. This makes predictive ML a powerful tool for businesses seeking to enhance their demand forecasting accuracy in the face of an ever-evolving market landscape.

Demand Forecasting with Predictive Machine Learning

How Predictive ML Enhances Demand Forecasting

Demand Forecasting with Predictive Machine Learning leverages advanced algorithms, vast datasets, and computing power to improve demand forecasting in the following ways:

  • Data Integration: ML can incorporate diverse data sources, such as social trends, weather conditions, and economic indicators, into demand forecasting models. This allows businesses to gain a more holistic understanding of factors affecting demand beyond their internal sphere of influence (a minimal sketch follows this list).
  • Real-time Analysis: ML models continuously analyse data in real-time, enabling businesses to react swiftly to changes in demand patterns and market conditions.
  • Pattern Recognition: ML excels at recognising complex patterns and correlations in data that may go unnoticed by traditional methods. This means businesses can make more accurate predictions.
  • Forecast Accuracy: By providing more accurate demand forecasts, predictive ML helps businesses reduce excess inventory and minimise stock-outs. This, in turn, reduces carrying costs and boosts customer satisfaction. Using ML to drive your demand forecasting can deliver up to a 20% increase in accuracy, leading on average to a 10% increase in revenue.
  • Scenario Analysis: ML can simulate different scenarios, helping businesses make informed decisions about inventory levels, pricing strategies, and production schedules before actually committing the resources to the change.
  • External Data Integration: Predictive ML has the capacity to bring in relevant external data sources, enriching the forecasting process. These data sources may include social media sentiment, economic indicators, and even competitor activities. This external data provides valuable context, enabling businesses to align their strategies with real-world events and consumer sentiments, ultimately leading to more precise forecasts.
  • Personalised Marketing: ML can segment customers into micro-markets and tailor marketing campaigns to specific customer groups. This results in more effective marketing efforts and improved customer engagement, and allows businesses to better understand where to find their customers, what to offer them, when to offer it, and how to talk to them.
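As referenced in the data-integration point above, here is a minimal sketch of blending internal history with external signals in one model, assuming scikit-learn; the features (weather, event flag, sentiment index) and the numbers are illustrative.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "lag_1_sales": [100, 110, 95, 130, 120, 105, 140, 125],     # internal
    "temp_c": [14, 16, 9, 21, 19, 12, 23, 20],                  # weather
    "local_event": [0, 0, 0, 1, 0, 0, 1, 1],                    # events
    "sentiment_idx": [0.1, 0.3, -0.2, 0.5, 0.2, 0.0, 0.6, 0.4], # sentiment
    "sales": [108, 112, 92, 150, 118, 101, 160, 138],
})
X, y = df.drop(columns="sales"), df["sales"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)
print("Holdout R^2:", model.score(X_test, y_test))
```
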
Demand Forecasting with Predictive Machine Learning

Conclusion:

The inclusion of external data in predictive ML models opens up a world of possibilities for businesses, allowing them to tap into real-time trends and market dynamics that can significantly impact demand. As a result, predictive ML not only provides more accurate forecasts but also equips businesses with the knowledge to proactively adapt to changing conditions and stay one step ahead of the competition.

Contact Us to find out how Demand Forecasting with Predictive Machine Learning can help you!

Insights Report: Leveraging National Sentiment and Predictive Technologies in Marketing

Download our latest insights report, stemming from an in-depth Marketing Survey. Gain a greater understanding of the future landscape of predictive analytics, and its profound implications for businesses in an increasingly data-centric world.

Within this document, we unveil valuable findings obtained through a survey focused on the potential advantages of utilising software capable of accurately forecasting how the national mood impacts customer buying patterns. We garnered insights from more than 100 marketing professionals employed by well-established retail and hospitality brands to assess the level of interest and perceived usefulness associated with this predictive technology.

Leveraging Large Language Models for Enhanced Contextual Understanding at Predyktable

1- Introduction:

In an ever-evolving world, Predyktable acknowledges the dynamic nature of our surroundings and its profound influence on consumer-business interactions. To navigate these changes effectively, we gather data from diverse sources, encompassing both structured data (e.g. weather and financial indices) and unstructured data (e.g. text and images) and input them into our data pipeline.

Structured data offers a straightforward modelling process, characterised by organisation and logic. For instance, it’s simple to assert that 20 degrees is warmer than 18 degrees. In contrast, unstructured data poses a challenge due to its semantic richness. Defining whether red is superior to green or quantifying the distinctions between Rock and Pop music in a numeric fashion can be intricate tasks.

Predyktable's Large Language Model

2- The Role of Large Language Models (LLMs):

Large Language Models (LLMs) represent a category of artificial intelligence systems endowed with the ability to comprehend and generate human language. These models are meticulously trained on vast datasets comprising text and code, enabling them to grasp the subtleties of human language.

Although LLMs’ primary function is to generate information, in the form of chat or code generation, the same machinery facilitates the conversion of contextual data into a numeric format that seamlessly integrates into predictive pipelines. For instance, using an LLM, we can encapsulate the disparities between a Taylor Swift concert and a Metallica concert. The LLM, with its linguistic prowess, has learnt that these events attract distinct audiences and can translate this understanding into numeric representations for more robust modelling.

3- Understanding Large Language Models’ Functionality:

LLMs operate by first converting textual information into numerical values, a process commonly referred to as tokenisation, and then subjecting these values to algorithmic computation. Once the text is tokenised, the LLM leverages its language proficiency to derive meaning from it.

For instance, when presented with the sentence “Taylor Swift is a pop singer,” the LLM dissects it, recognising Taylor Swift as a person, a singer, and an artist in the pop genre. It also comprehends the intricate relationships among these concepts. But in reality, we don’t need to tell it who Taylor Swift is, or how she relates to Kanye; it has already learned this information and can use it to tell us.
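A quick look at tokenisation using a BERT tokenizer from the Hugging Face transformers library; the exact token ids and splits vary by model.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer("Taylor Swift is a pop singer")

print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
# e.g. ['[CLS]', 'taylor', 'swift', 'is', 'a', 'pop', 'singer', '[SEP]']
print(encoded["input_ids"])   # the numbers the model actually computes on
```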

Tokenisation

4- Advantages of Harnessing Large Language Models for Contextual Data Encoding:

Several advantages emerge from using LLMs to encode contextual data, including:

  1. Complex Relationship Capture: LLMs adeptly capture intricate relationships between diverse concepts.
  2. Handling Unquantifiable Data: LLMs empower the representation of challenging-to-quantify data, like distinctions between different event types.

5- A real-world example:

To illustrate how Predyktable employs LLMs for contextual data encoding, consider this scenario:

Imagine Predyktable is partnering with a high-end women’s clothing retailer located in bustling urban areas. The retailer specialises in a wide range of women’s fashion, catering to diverse tastes and preferences. Their objective is to gain a comprehensive understanding of how various events occurring in their target market influence their sales trends. To achieve this, Predyktable harnesses the power of LLMs proficient in language understanding. Here’s how the process unfolds:

Event Data Encoding: Predyktable starts by collecting data on upcoming events relevant to the retailer’s market. These events could encompass a wide spectrum, including fashion shows, cultural festivals, music concerts, and sporting events. For each event, the LLM is tasked with encoding critical information, such as:

• Event Type: This entails categorising the event, whether it’s a fashion show, music concert, sports game, or any other type.

• Event Date: Precise date information is recorded to establish the timing of the event.

• Event Location: The LLM captures details about where the event is taking place, whether it’s in the retailer’s city or another location.

Clothing Line Data Encoding: Simultaneously, the LLM encodes information about the retailer’s clothing lines. This encompasses a thorough analysis of their diverse product offerings, focusing on factors such as:

• Clothing Type: The LLM differentiates between various clothing categories, such as dresses, tops, pants, and accessories.

• Brand Information: It identifies the brands carried by the retailer, distinguishing between different labels and their respective popularity or prestige.

Building the Predictive Model: With the event and clothing line data successfully encoded by the LLM, Predyktable’s data scientists can proceed to build a predictive model. This model is designed to forecast how diverse events will impact the retailer’s sales. Here’s how this works:

• Event-Product Interaction Analysis: By leveraging the encoded data, the predictive model can analyse how specific types of events affect the sales of particular clothing items. For instance, it can identify whether fashion shows boost the sales of high-end designer dresses or if music concerts have a more significant impact on casual apparel.

• Time Sensitivity: The model considers the timing of events, ensuring that sales predictions account for both the event’s date and the lead-up time.

• Data Integration: It integrates the event data with other relevant factors, such as historical sales data, customer demographics, and marketing efforts, to generate comprehensive forecasts.

Ultimately, this predictive model equips the clothing retailer with invaluable insights. It enables them to make informed decisions about inventory management, marketing strategies, and event participation.
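A hedged sketch of this encode-and-combine pattern: event descriptions are embedded with an LLM encoder and joined with simple non-text features in a sales-impact model. The libraries, features and uplift figures are illustrative assumptions, not Predyktable’s production pipeline.

```python
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import Ridge

encoder = SentenceTransformer("all-MiniLM-L6-v2")
events = [
    "Designer fashion show in the city centre, evening event",
    "Stadium rock concert on the outskirts, weekend afternoon",
]
event_vecs = encoder.encode(events)

# Non-text features: days until the event, held in the retailer's city (1/0).
extra = np.array([[3, 1], [10, 0]])
X = np.hstack([event_vecs, extra])
y = [0.18, 0.02]   # hypothetical observed sales uplift per event

model = Ridge().fit(X, y)   # regularised fit copes with wide feature vectors
```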

Predyktable's data in our LLM

6- Conclusion:

Along with text generation and chat, Large Language Models serve as a potent instrument for numerically encoding contextual data, enriching predictive pipelines. Through the utilisation of LLMs, Predyktable elevates its capacity to construct enhanced models that better serve its clientele.

6.1- Further Considerations:

While LLMs continue to evolve, they have the potential to redefine our interactions with computers. Applications like chatbots capable of comprehending and responding to natural language, and precise machine translation systems bridging language gaps, are on the horizon.

Moreover, LLMs wield a substantial influence on the field of artificial intelligence, contributing to the development of innovative AI systems like autonomous vehicles and medical diagnostics.

The ongoing evolution of LLMs holds promise for diverse and positive impacts across numerous domains, igniting anticipation for the transformative potential they bear on the world.

The Three Pillars of Marketing Optimisation

1.    Introduction:

In today’s rapidly evolving business landscape, marketing optimisation has become an essential strategy for companies to stay competitive and achieve sustainable growth. The three pillars of marketing optimisation – customer high-value segmentation, personalised content, and ad spend allocation – play a pivotal role in maximising the effectiveness of marketing efforts. However, traditional marketing techniques have their limitations when it comes to these pillars. Fortunately, predictive analytics, when combined with external consumer behaviour data, has emerged as a game-changer, empowering businesses to overcome these challenges and supercharge their marketing optimisation strategies.

The Three Pillars of Marketing Optimisation

2.    The Three Pillars of Marketing Optimisation:

2.1  Customer High-Value Segmentation:

Customer high-value segmentation involves dividing your customer base into distinct groups based on their value to your business. Traditionally, marketers rely on their own internal demographic data alone, leading to limited insights and an inability to handle complex customer data sets. Predictive analytics addresses these drawbacks by utilising advanced algorithms and machine learning techniques to analyse vast amounts of customer data from various sources. By integrating external consumer behaviour data, businesses gain a more comprehensive understanding of their customers’ preferences and behaviours beyond their own interactions.

Benefits of Predictive Analytics in Customer High-Value Segmentation:

  • Identifying hidden patterns: Predictive analytics uncovers previously unknown segments of high-value customers based on external behaviour indicators that are relevant to the business.
  • Real-time updates: Continuously analysing internal and external data provides real-time insights into customer behaviour, ensuring accurate and up-to-date high-value segments.
  • Precision targeting: Predictive analytics refines the segmentation process, enabling more precise targeting of customers with the highest potential value based on both historical interactions and current behaviour patterns.
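As a concrete (if simplified) illustration of high-value segmentation, the sketch below clusters customers on RFM-style features with scikit-learn; the feature set, data and cluster count are assumptions made for illustration.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

customers = pd.DataFrame({
    "recency_days": [5, 40, 3, 90, 12, 60],
    "orders_per_year": [24, 4, 30, 1, 18, 2],
    "avg_basket": [55.0, 20.0, 80.0, 15.0, 45.0, 25.0],
})
X = StandardScaler().fit_transform(customers)   # put features on one scale
customers["segment"] = KMeans(n_clusters=2, n_init=10,
                              random_state=0).fit_predict(X)
print(customers)   # frequent, high-spend customers land in one segment
```
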
Customer High-Value Segmentation

2.2  Personalised Content:

Personalised content involves delivering tailored marketing messages, offers, and experiences to individual customers or specific segments. Traditional marketing often relies on mass communication and manual customisation based on gut feel, leading to lower engagement and inconsistent messaging. Predictive analytics, combined with external consumer behaviour data, revolutionises personalised content creation and delivery.

Benefits of Predictive Analytics in Personalised Content:

  • Advanced personalisation algorithms: Predictive analytics identifies patterns in external consumer behaviour, allowing businesses to create more advanced personalisation algorithms and generate content recommendations based on consumers’ interactions with other brands and content types.
  • Cross-platform consistency: By considering external consumer behaviour across different platforms, businesses maintain consistency in personalised content delivery regardless of where customers engage with the brand.
  • Real-time content optimisation: Predictive analytics enables real-time optimisation of personalised content elements to meet consumers’ immediate needs and interests.
Personalised Content

2.3  Ad Spend Allocation:

Ad spend allocation involves strategically distributing the advertising budget across different marketing channels and campaigns to maximise ROI. Traditional methods suffer from inaccurate measurement, limited real-time optimisation, and inefficient budget allocation. Predictive analytics, coupled with external consumer behaviour data, revolutionises ad spend allocation strategies.

Benefits of Predictive Analytics in Ad Spend Allocation:

  • Enhanced attribution modelling: Predictive analytics attributes conversions and key metrics to specific advertising channels, considering both internal and external consumer behaviour data, allowing businesses to allocate ad spend to the most effective channels.
  • External market trends: Analysing external consumer behaviour data helps businesses understand broader market trends and target emerging markets or new customer segments with high potential.
  • Real-time optimisation: Predictive analytics provides real-time performance insights, enabling marketers to adjust ad spend allocation on the fly based on changing market conditions and consumer behaviour.

3.    Conclusion:

The three pillars of marketing optimisation – customer high-value segmentation, personalised content, and ad spend allocation – form the backbone of successful marketing strategies. Traditional marketing techniques have their limitations, but predictive analytics, when combined with external consumer behaviour data, offers a powerful solution to overcome these challenges. By leveraging advanced algorithms, machine learning, and real-time insights, businesses gain a deeper understanding of their customers, create personalised and relevant content, and allocate their ad spend more strategically, ultimately leading to improved marketing performance and business growth in today’s dynamic market.

Industry view: what’s really challenging retail & hospitality executives

By Phillip Sewell, CEO at Predyktable

I recently conducted a series of interviews with senior executives working across both the retail and hospitality industries to gain a deeper understanding of the most pressing challenges and priorities they currently face.

As these industries continue to navigate an ever-changing landscape, it’s crucial to understand the perspectives of those at the forefront. From supply chain disruptions to shifting consumer preferences, the insights gleaned from these discussions shed light on the most critical issues facing these industries today.

Here are the why’s and how’s behind the tough decisions these senior executives face, with their fascinating insights distilled in a Q&A below.

1- Given the cost-of-living rise and increased costs throughout your supply chain, how will you remain profitable?

“Many CEOs are ex-CFOs, so unsurprisingly they’re dealing with the cost of living by finding ways to cut expenses and remove services – but without damaging sales or losing customers. In fact, across our outlets we’ve reluctantly increased prices by 10% to offset supply chain costs.” CIO – Multi-channel Retailer

“As a direct-to-consumer business, we’ve also put up prices due to a 500% increase in freight costs. We’re now hedging our bets with our supply chain: trying to lock in fixed prices for 5 years to offset the volatile market. We’re also exploring new territories to offset the challenges globally, and where to invest to reduce operational costs.” CEO – Retail

“Customers always want more for less, but prices are going up and promotions are being increased in what has traditionally been a high peak end of season and new season. This is an indicator of how pub and hotel operators are struggling.” MD – Hospitality

Increased prices in retail and hospitality

“As a multinational restaurant chain, we are changing fees to align more to market realities. We need to focus on new business, we’re extending reach beyond our current portfolio – while growing revenue from our existing customer base.” MD – Hospitality

“As a DIY retailer, we need a more agile, flexible supply chain. We’re focusing on what’s driving value, so we’re looking at things like optimising demand forecasting. We are raising prices and measuring the sensitivity of this, while finding ways to reduce supply chain costs. We are also either reducing advert spend or making it work better.” Marketing Director – Retail

“It’s all about price. We’re having to increase prices by 13-14% per annum across our restaurant brands. It’s difficult to get the second visit during the week, so our pricing is keener. It’s a perfect storm of costs and balancing acts.”  Marketing Director – Hospitality

“We have raised prices, but not too much as we’re a price-sensitive confectionery brand. We’re taking a hit on margin and hope it comes back. In the short term, we’re managing costs to mitigate this. It’s survival of the fittest, you try to hoover up market share and hope you retain it in the longer term.” Chief Growth Officer – Hospitality

2- What other issues are you facing today and what are the long-term impacts?

“We must be price sensitive to consumers’ expectations. We’re asking things like: what are people willing to accept? How do you quantify the impact service quality has on price points? Managing costs will be critical, and staffing impact in the long term is a concern. We need to better predict what the labour market will be like in 5 years’ time and what changes to our recruitment model can mitigate against this.” CIO – Multi-channel Retailer

“Volumes are not where they were, and we’ve been hiking prices. There’s still a role for pubs for informal occasions versus restaurants, but it’s all about getting people through the door.” Marketing Director – Hospitality

man and woman having dinner at restaurant

“It’s all about where to find new business. Customers are no longer loyal, basically businesses are just “swapping” customers and not stimulating new growth. We need new revenue and new customers.” Digital Transformation Director – Retailer

“Online will not hold the dominance it once did as the cost of online is becoming less feasible and concerns on the environment increase. We may see a shift back to bricks and mortar to deliver a greater experience.” Global VP of E-Commerce – Retailer

“There is a danger of oversupply in the market for restaurants. After many closures during covid, there’s been some aggressive new openings with new operators mostly in city centres. I think there will be an implosion. Pubs have really got their act together and are well placed to challenge restaurants, they also suit people when they’re working from home.” Chief Marketing Officer – Hospitality

“Recessionary impact and labour availability are big issues. Everyone in the industry is suffering, with chefs being the most difficult to recruit. We’re using some central kitchens to produce food consistently and reduce the impact at restaurant level.” Chief Growth Officer – Hospitality

3- What are your key priorities and investments over the next 3 years?

“Technology investment is key. We’re examining which technologies can return ROI – while solving the biggest problems we have. We do need to better understand which areas require investments to plug the leaks in costs.” CIO – Multi-channel Retailer

“Digital tools and online is one area of investment for us, coupled with systems to help labour scheduling. It’s all about making the central and pub teams become more efficient. Capex is being maintained, but it’s now focused on maintenance and improvement or conversion to new offers – rather than new builds.” Marketing Director – Hospitality

“We’re focusing on improving the supply chain. It’s the biggest cost centre and has the biggest negative impact on customer experience. Over 45% of customer care calls cover ‘where is my product?’ So, having a fantastic supply chain would help address this.” Digital Transformation Director – Retailer

“We’ll be investing in systems including ERP, PIM and re-platforming, to reduce the friction of doing business and enable scale and agility. Improving staff wellbeing is also key, especially as the fight to retain staff becomes increasingly critical. We are improving performance marketing that better connects with customers. Acquisition will also prove key, as the competition becomes increasingly fierce.” Global VP of E-Commerce – Retailer

“We’ll be driving like for like sales, including investing in the fabric of the building or in-restaurant technology that hits our sweet spot. Potential acquisitions are a consideration, with a focus on small operators with decent brands and locations. We’re also trying to find the sweet spot of recruitment and we’ll invest when we’ve got it right.” Chief Marketing Officer – Hospitality

“Staying relevant and interesting is core to our strategy. We need a competitive edge versus competitors, so we’ve got to work out what that is and then make it relevant.” Chief Growth Officer – Hospitality

Final thoughts

The COVID hangover means that everyone still has a short-term mentality. That is the sentiment from all those I spoke with. Profitability is now the short-term goal, rather than longer-term strategic planning that existed pre-COVID.

So, with key decisions on spend, labour optimisation, demand forecasting and more, how about the efficacy of current solutions that support decision-making?

All agree that business intelligence and data analytics have helped retail and hospitality executives understand and influence their customers’ buying habits – but only up to a point.

Despite billions of pounds spent globally on data platforms, data repositories and a whole stack of tools, most still lack the help they need to turn data into forward actions that maximise profits. Everyone agreed that more ‘prescriptive’ data insights are urgently required by brands: providing forward recommendations that support more profitable business-critical decisions. 

How to profit from prescriptive analytics in an uncertain world

A recent family trip to a fairground gave me a different perspective on the challenges of retail and hospitality professionals dealing with customers: where all is never as it seems.  

In the ‘fun house’ I was staring at the special mirrors where no matter the viewing angle, what was in front of me had little resemblance to what my mind expected to see. Staring back was something that looked vaguely recognisable, yet when I acted in a familiar way, it behaved in an unexpected manner.  

Remove the flashing lights and over-sugared kids, and this experience struck me as a great analogy of what it’s like for retail and hospitality professionals trying to understand and meet customers’ ever-changing demands. The unexpected shifting of sentiment and expectations, coupled with the fact that the only thing that’s certain is uncertainty, means it’s very difficult to make good business-critical decisions with confidence. 

Until now, retail and hospitality professionals’ decision making has been supported by business intelligence and retrospective data analytics capabilities. What’s lacking from these capabilities are clear, undistorted, short and long-term future views and knowing what to do about them. 

Many of these solutions are constrained as they rely on historic data and insights describing something that has already happened. Why is this no longer good enough? It’s because of ever-changing customer expectations and a two-year gap in historical data due to Covid. It’s why you can no longer afford to look back to move forward.  

Internal, rear-facing insight isn’t enough; you must examine how customers are behaving in the wider world, and why they’re not shopping, eating or staying with you. Brands must start utilising much broader external data sources to understand the impact that social, economic and environmental changes have on customer behaviour in general. Those brands that do will better understand the value of their actions, compared to the results of inaction.

With ad spend increasing, competition so fierce and margins so thin, not being certain of your decisions can be very expensive, and you’ll miss big opportunities.

The good news is that a relatively new, sophisticated capability called prescriptive analytics promises to solve these challenges by looking into the future and then recommending the most profitable course of action.

We’re talking about prescriptive analytics as a managed service blending descriptive, diagnostic and predictive insights, with cutting-edge artificial intelligence, machine learning, automation, genuine data science and in-sector consultancy expertise. Everything is custom built, with each step creating prescription models precisely choreographed to meet an individual organisation’s needs.  

This involves enhancing internal data with much wider external insights including global & local trends: weather, travel, localised demand spikes, and more.  Using this high-quality data, data scientists build and optimise prescription models which identify previously elusive, connected, patterns to deliver the most accurate foresight fuelled prescriptions.  

Data scientists also continually find new insights to keep models relevant, while learning from the data so they keep delivering value. By uniquely aggregating data from a wider range of external sector sources, models are further enriched to provide greater accuracy and depth to foresight.  

Ultimately, this means the prescription models keep getting better – so retail and hospitality professionals keep making the most profitable business decisions.  

Prescriptive analytics can help better forecast demand. Every retail and hospitality professional understands the importance of having the right product, in the right place, at the right time.  But how do they make profitable decisions through the lens of regional demand? 

It means digging deeper than just price, as customers’ expectations are also driven by availability, experience, and ethical considerations. It’s important for retail and hospitality professionals to ask the right questions. What do you do? Who are your customers? Do you have stores? Where are they located?

This baseline information is then enriched with global data, including how stores, hotels or restaurants are affected by seasonality, bank holidays, days of the week and more. Individual stores, hotels or restaurants are isolated and modelled independently: answering how customers in these areas are behaving. Is it an area of growth? Will it be impacted by reduced disposable income?

Next up, more dynamic effects are considered, such as weather, tourism, travel disruptions, proximity to transport and event hubs. All this information is combined with advanced models to reveal what regional demand could look like.  

Imagine a scenario where you could utilise external data that tells you the volume of people expected to attend an event near your venue or store. How about understanding the demographics of those attending, and the types of products and services they would be interested in? Then imagine layering further insight on what’s driving this increased footfall past your door: such as local traffic disruptions, or weather conditions at this specific time and day.

Equipped with this foresight, you can make more profitable localised decisions on staffing, inventory, promotions and pricing.

There’s so much value that can be generated with prescriptive analytics as a service, how about achieving these outcomes for starters: 

  • Accurately forecast future demand 
  • Enhance customer experiences 
  • Boost sales and profits 
  • Increase satisfaction and loyalty  

Expect a return on investment in months, not years. McKinsey’s research asserts that prescriptive analytics is poised to continue delivering strong return on investment and to become an increasingly important tool for businesses.

Whatever your size, whatever your uncertainty, Predyktable delivers fully managed prescriptive analytics as a service. We generate actionable foresight faster to address your specific needs, without complexity and compromise. We’d love to hear how we can help your retail or hospitality brand make more profitable decisions. 

Let’s Talk About the Power of Sentiment

Sentiment analysis (SA) has never really gone away, but it’s certainly seen a strong resurgence as social media has grown to become a core channel for so many brands and consumers the world over. Before social media, where exactly did we garner data for sentiment analysis?

Believe it or not, SA has been around since the 1950s, when it was primarily used on paper documents. Over the decades, it’s closely followed the channels and communities in which we express ourselves. By the birth of the internet, the use of SA adjusted to include the early channels of the social web, such as forum posts and online articles.

Today, it’s difficult to comprehend just how many sources and data points can be included in SA. But this works in our favour for several reasons.

  1. The more data we can gather, the more accurate we can be in our reporting
  2. The more channels and sources we can monitor, the more broad and diverse our data
  3. The more choice we have, the more we can customise our requirements

Sentiment analysis is one of the most valuable exercises in making your brand or organisation more customer-centric. It’s a direct line to the collective voice of the consumer, whether they’ve bought from you, plan to buy from you, or aren’t planning to buy from you at all.

We must remember that positive, neutral and negative data are all good data. Brand strategy isn’t just built on why people want your product or service, but also on why they might not want it. And it’s important to use that data to shape your marketing and, ultimately, your own voice.
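For a flavour of how raw text gets a sentiment label in practice, here is a minimal sketch using the Hugging Face transformers sentiment pipeline; the default model and sample reviews are illustrative stand-ins for a production SA pipeline.

```python
from transformers import pipeline

classify = pipeline("sentiment-analysis")
reviews = [
    "Lovely staff, will definitely come back.",
    "Waited 40 minutes and the order was still wrong.",
]
for review, result in zip(reviews, classify(reviews)):
    print(result["label"], f"{result['score']:.2f}", "-", review)
```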

It’s a very valid question: can we have too much of a good thing? In our view, the current SA landscape is a wild, wild west. With new platforms springing up left, right and centre (notable additions in the last several years include sites such as Glassdoor and Polywork), and entirely new ways of sharing content able to completely disrupt a channel (we’re looking at you, TikTok), it’s tough to keep up, and even tougher to sift through so much data.

That’s where Predyktable helps. Traditional SA methods can be time-consuming, confusing and sometimes inaccurate due to undetected anomalies. Our model accounts for all of this and takes the hard work and frustration out of SA. In fact, we go a step further, taking SA way beyond social media, a point at which many traditional SA businesses stop and send you their invoice. 

The concept of monitoring such a broad range of sources, including blogs, reviews, call centre logs and even search engine terms, might be a little daunting. All of that unstructured data from so many different kinds of people using so many different channels. 

This is where our data visualisation tool comes in, breaking down broad and complex models into easily understandable data which, most importantly, is actionable.

This quite literally gives you a clearer picture of pain points, frustrations, experiences and more. And with the increasing power of not only our tools, but the social channels that your customers are using, we’re able to segment by demographics such as region, gender, age and even lifestyle.

Our sentiment analysis is different, and we’re incredibly proud of what we’ve built and will continue to build upon and improve. Imagine being able to understand your brand’s reputation across such a broad range of sources. Imagine being able to spot potential areas of growth and investment, and to be able to act upon them now rather than later.

And perhaps most importantly in this day and age, imagine being able to spot negative sentiment and being able to deal with it there and then, long before it spirals out of control and causes damage to your brand. (To be clear, we’re not talking about something malicious like covering up bad reviews, we’re talking about an interactive and agile approach to your brand strategy, and making positive improvements to your product or service as a result of negative feedback.)

It’s also important to consider how SA can open you up to more modern ways of marketing. For instance, imagine being able to identify key social media influencers to champion your brand. In the age of influencer marketing, this isn’t a channel to be ignored. Our platform is so powerful that it will even ensure the influencers that you’re seeing within your data have been fully verified for reach and engagement, ensuring you work with the right people.

In a noisy, crowded world of challenging reviews, increasing customer expectations and online channels where opinions fly overhead like rockets, it’s never been more important to harness the power of sentiment analysis. What’s difficult is finding the right people and platform to cut through the noise and make sense of it all.

Thankfully, that’s us. If you’re looking for a technology service that has its ear to the ground and will go the distance with you, let’s talk soon.