The architecture of the Sofia Conversational AI Platform comprises four core layers and four supporting layers; the supporting layers interface with users of the platform and integrate with other enterprise applications. The core layers are the Conversation Navigator, the Conversation Manager, the NLU module and the NLG module. The supporting layers are the Channel Aggregator, the Data Layer, Sofia Analytics and the Sofia SDK.
The Conversation Navigator is where switching between sub-conversations happens: at any given time, the next path is selected according to the outcome of the current sub-conversation.
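As a rough sketch of this routing idea, outcomes of a sub-conversation can be mapped to the next sub-conversation to enter. The outcome labels, path names and the `selectNext` function below are illustrative assumptions, not the platform's actual API:

```typescript
// Illustrative sketch only: route to the next sub-conversation based on
// the outcome of the current one. All names here are hypothetical.
type Outcome = "resolved" | "escalate" | "needs_payment";

const routes: Record<Outcome, string> = {
  resolved: "goodbye",          // wrap up the conversation
  escalate: "human_handover",   // hand over to a human agent
  needs_payment: "payment",     // branch into the payment sub-conversation
};

function selectNext(outcome: Outcome): string {
  return routes[outcome];
}
```

In practice the Navigator would evaluate such a mapping every time a sub-conversation completes.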
The Conversation Manager is where instances of sub-conversations are created and simulated. The internals of a sub-conversation are modelled as a processing pipeline, and the Conversation Manager creates simulation spaces for sub-conversations based on their corresponding processing pipelines.
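The pipeline idea can be sketched as a sequence of stages transforming a shared context object; the stage names and shapes below are our own illustration under that assumption, not Sofia's actual stages:

```typescript
// Hypothetical sketch: a sub-conversation modelled as a processing pipeline.
// Each stage transforms a shared context; the Conversation Manager would
// instantiate one such pipeline per sub-conversation.
interface Context { input: string; intent?: string; response?: string; }
type Stage = (ctx: Context) => Context;

function runPipeline(stages: Stage[], input: string): Context {
  const initial: Context = { input };
  return stages.reduce((ctx, stage) => stage(ctx), initial);
}

// Toy stages standing in for NLU, fact lookup, NLG, etc.
const detectIntent: Stage = (ctx) => ({ ...ctx, intent: "greeting" });
const generate: Stage = (ctx) => ({ ...ctx, response: `intent=${ctx.intent}` });

const result = runPipeline([detectIntent, generate], "hello");
```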
Natural language understanding (NLU) is part of the processing pipeline: it serves directly under the NLU stage and indirectly under all the other stages of the pipeline. We use transformer-based encoding to achieve accurate results.
The NLG module also plays a role in the processing pipeline, specifically under the “Fact Base” stage. Currently, you can configure a fact base as a simple paragraph, and NLG generates responses based on the facts found there and the user’s query. We have come up with a unique model that combines the user query and the found facts, together with glue words (sentence fragments that may appear in neither the query nor the facts), to produce a response that reads naturally while answering the user’s query.
Conversational AIs can be deployed on different channels such as social media, web applications, mobile apps and telephony. Depending on the channel, voice, text or a combination of both can be used as the method of communication between the user and the AI. The Channel Aggregator acts as a hub that connects the different channels to the AI and manages sessions accordingly.
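The hub-and-sessions idea can be sketched as follows; the class, method names and session-key scheme are assumptions for illustration, not the Sofia API:

```typescript
// Illustrative sketch: a hub that registers channels and keys sessions by
// (channel, user). All names here are hypothetical.
interface Channel { name: string; }

class ChannelAggregator {
  private channels = new Map<string, Channel>();
  private sessions = new Map<string, { channel: string; user: string }>();

  register(channel: Channel): void {
    this.channels.set(channel.name, channel);
  }

  // Create (or reuse) a session for a user on a given channel.
  openSession(channelName: string, user: string): string {
    const key = `${channelName}:${user}`;
    if (!this.sessions.has(key)) {
      this.sessions.set(key, { channel: channelName, user });
    }
    return key;
  }
}

const hub = new ChannelAggregator();
hub.register({ name: "web" });
const sessionId = hub.openSession("web", "alice");
```

Keeping session management here, rather than in the AI core, is what lets new channels be added without touching the conversation logic.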
The Data Layer receives its inputs directly from the Channel Aggregator. This gives a huge advantage in terms of scaling and of introducing new channels, since the Channel Aggregator can apply per-channel preprocessing and session-management strategies.
The Data Layer collects logs, data and statistics. Data is mainly defined by the entities extracted by the AI engine, but custom data points can be captured via Sofia Markup Language (SoML) tags present in the responses generated by the AI.
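SoML's actual syntax is not shown in this document, so purely to illustrate the mechanism, the sketch below assumes a made-up tag form `<soml:data name="...">value</soml:data>` embedded in a response and shows how custom data points could be extracted from it:

```typescript
// Hypothetical illustration only: the tag syntax below is invented for this
// sketch and is NOT real SoML. It shows how custom data points embedded in
// an AI response could be collected by the Data Layer.
function extractDataPoints(response: string): Record<string, string> {
  const points: Record<string, string> = {};
  const re = /<soml:data name="([^"]+)">([^<]*)<\/soml:data>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(response)) !== null) {
    points[m[1]] = m[2];
  }
  return points;
}

const reply = 'Sure! <soml:data name="product">laptop</soml:data> added to your cart.';
const pts = extractDataPoints(reply);
```

The tags would be stripped before the response reaches the user, leaving only the captured data points in the Data Layer.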
When it comes to analytics, capturing statistics from user-AI conversations is very challenging, and current conversational AI platforms do not provide a unified solution to this problem. In our platform, SoML can be used to tag events (such as the detection of an entity, branching into a certain path of the chat flow and many more), and statistics are collected under these tags. This approach gives you the flexibility to derive almost any statistic from conversations in a very convenient way.
Sofia Analytics is a framework for analysing the statistics derived from conversations between AIs and users. The Sofia Analytics web application allows authorised users to configure an analytics dashboard by combining different “Analytics Tags” with a graph type. The Analytics Tags are the same tags used in the Data Layer, under which the statistics are collected. By choosing the right Analytics Tags, the dashboard can therefore be configured to analyse almost anything. In a practical deployment of conversational AIs, you can easily have analytics such as product demand as a pie chart, product interests and trends as an area chart, the main conversation paths users follow as a bar chart, and much more.
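A dashboard configuration of this kind amounts to pairing Analytics Tags with chart types. The tag names and the config shape below are assumptions for illustration, not the actual Sofia Analytics format:

```typescript
// Illustrative sketch: a dashboard as a list of (Analytics Tag, chart type)
// pairs. Tag names and this shape are hypothetical.
type ChartType = "pie" | "area" | "bar";

interface Widget { tag: string; chart: ChartType; }

const dashboard: Widget[] = [
  { tag: "product_demand", chart: "pie" },     // demand per product
  { tag: "product_trends", chart: "area" },    // interests and trends over time
  { tag: "conversation_paths", chart: "bar" }, // main paths users follow
];
```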
Almost any conversational AI application requires some degree of integration with an existing system or a new conventional interface. With this in mind, the Sofia SDK was developed, giving you access to the Data Layer and to the AI itself. At the moment, the SDK only supports NodeJS, but we plan to cover other technologies in the future.
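As a hypothetical sketch of what NodeJS integration code might look like, the interface, stub client and method names below are entirely our assumptions, not the real Sofia SDK surface:

```typescript
// Hypothetical sketch only: this mimics the shape of an integration built
// on a NodeJS SDK; none of these names are the actual Sofia SDK.
interface DataLayerClient {
  query(tag: string): Promise<number>;
}

// A stub standing in for an SDK-provided client.
const dataLayer: DataLayerClient = {
  async query(tag: string): Promise<number> {
    return tag === "product_demand" ? 42 : 0; // canned value for the sketch
  },
};

async function main(): Promise<void> {
  const count = await dataLayer.query("product_demand");
  console.log(`product_demand events: ${count}`);
}

main();
```

An integration would swap the stub for the real client and feed the results into its own backend.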