LOFT: LLM-Orchestrator Open Source Framework
Robust features for chat handling, event detection (including hallucinations), and more. Innovating backend systems with high-throughput, scalable functionality. Independent of any HTTP framework, LOFT ensures scalability with a queue-based architecture and supports rate limiting for large-scale deployments.
Get in Touch
LOFT's Architecture
Build Impactful Experiences with Leading Large Language Models
Built on a queue-based architecture, LOFT supports rate limiting and horizontal scaling, making it ideal for large-scale deployments. LOFT is independent of any HTTP framework, enabling limitless possibilities for custom AI implementations in your digital experiences.
LOFT's Key Features
OMNICHANNEL
LOFT is framework-agnostic: it integrates seamlessly into any backend system without dependencies on HTTP frameworks.
PERSONALIZATION
The framework provides dynamically computed prompts, supporting custom-generated prompts for personalized user interactions.
SCALABILITY
The queue-based architecture, rate limiting, and horizontal scaling make LOFT ready for large-scale deployments.
EVENT DETECTION & HANDLING
Advanced capabilities for detecting and handling chat-based events, especially hallucinations.
PRIVACY & SECURITY
The framework ensures utmost privacy and high security standards, providing enterprise-grade delivery of digital experiences.
CHAT INPUT/OUTPUT MIDDLEWARES
Extensible middleware for chat input and output.
Interested in the LOFT framework?
Feel free to fill out the form to arrange a product demonstration.
Book a Demo
LOFT's Concepts
System Message Computers
This is a callback function that can modify the SystemMessage before it is sent to the LLM API; a sketch follows the list below.
- modify the SystemMessage
- call third-party services or DB queries
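For illustration, here is a minimal sketch of a system message computer in TypeScript. The types, the callback signature, and the `fetchUserProfile` lookup are assumptions for the example, not LOFT's exact API.

```typescript
// Hypothetical types and callback shape; LOFT's actual API may differ.
interface SystemMessage {
  content: string;
}
interface SessionContext {
  userId: string;
}

// Stub standing in for a real DB query or third-party service call.
async function fetchUserProfile(userId: string): Promise<{ language: string }> {
  return { language: "en" };
}

// A system message computer: enriches the SystemMessage before it is sent to the LLM API.
async function systemMessageComputer(
  message: SystemMessage,
  context: SessionContext
): Promise<SystemMessage> {
  const profile = await fetchUserProfile(context.userId);
  return {
    content: `${message.content}\nAlways answer in the user's preferred language: ${profile.language}.`,
  };
}
```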
Prompt Computers
This is a callback function that can modify the Prompt before it is injected into the request sent to the LLM API; a sketch follows the list below.
- can modify the Prompt
- could call third-party services or DB queries
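A prompt computer could look similar. Again, the `Prompt` type, the signature, and the `queryLastOrder` helper below are assumptions used only to illustrate computing a prompt dynamically per user.

```typescript
// Hypothetical types and callback shape; LOFT's actual API may differ.
interface Prompt {
  text: string;
}

// Stub standing in for a real DB query.
async function queryLastOrder(userId: string): Promise<{ id: string }> {
  return { id: "A-1001" };
}

// A prompt computer: rewrites the Prompt right before it is injected into the LLM request.
async function promptComputer(prompt: Prompt, userId: string): Promise<Prompt> {
  const lastOrder = await queryLastOrder(userId);
  return { text: `${prompt.text}\nContext: the user's most recent order is #${lastOrder.id}.` };
}
```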
Input Middlewares
This is a chain of functions that can modify the user input before saving it in history and sending it to the LLM API; a sketch follows the list below.
- can modify the user input
- does not have a session or access to the chat history
- can call third-party services or DB queries
- can control the chain of input middlewares by calling the `next()` callback
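The sketch below illustrates an input middleware chain. The `(input, next)` shape mirrors the `next()` callback mentioned above, but the exact parameter list and registration are assumptions.

```typescript
// Hypothetical middleware shape; LOFT's actual API may differ.
type InputNext = (input: string) => Promise<void>;
type InputMiddleware = (input: string, next: InputNext) => Promise<void>;

// Redacts e-mail addresses before the input is saved to history and sent to the LLM API.
const redactEmails: InputMiddleware = async (input, next) => {
  const redacted = input.replace(/\b\S+@\S+\.\S+\b/g, "[email]");
  await next(redacted); // hand control to the next input middleware
};

// Drops empty messages by never calling next(), which stops the chain.
const dropEmpty: InputMiddleware = async (input, next) => {
  if (input.trim().length > 0) {
    await next(input);
  }
};

const inputMiddlewares: InputMiddleware[] = [dropEmpty, redactEmails];
```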
Output Middlewares
This is a chain of functions that can modify the LLM response before saving it in history and sending it to the Event Handlers; a sketch follows the list below.
- can modify the LLM response
- can access the chat history
- can set the custom session context
- can use the `session.messages.query()` method to query the chat history
- can use the `session.messages` methods to modify the chat history
- can call third-party services or DB queries
- can call `callRetry()` to restart the chat completion job with the modified chat history
- can control the chain of output middlewares by calling the `next()` callback
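The following sketch shows an output middleware that uses the history query, custom session context, and retry capabilities listed above. `session.messages.query()`, `callRetry()`, and `next()` come from the description; their exact signatures here are assumptions.

```typescript
// Hypothetical shapes; LOFT's actual signatures may differ.
interface OutputSession {
  messages: { query: (filter: { role?: string }) => Promise<string[]> };
  context: Record<string, unknown>;
}
type OutputMiddleware = (
  response: string,
  session: OutputSession,
  next: (response: string) => Promise<void>,
  callRetry: () => Promise<void>
) => Promise<void>;

// Restarts the completion job if the model repeated its previous answer verbatim.
const dedupeAnswers: OutputMiddleware = async (response, session, next, callRetry) => {
  const previous = await session.messages.query({ role: "assistant" });
  if (previous.length > 0 && previous[previous.length - 1] === response) {
    session.context["dedupeRetried"] = true; // custom session context
    await callRetry(); // restart the chat completion job with the current history
    return;
  }
  await next(response); // continue the output middleware chain
};
```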
Event Handlers | Default Handler
This is a chain of event detectors and registered handlers used to handle the LLM response based on the chat history; a sketch follows the list below.
- can access the chat history
- can set the custom session context
- can use the `session.messages.query()` method to query the chat history
- can use the `session.messages` methods to modify the chat history
- can call third-party services or DB queries
- can call `callRetry()` to restart the chat completion job with the modified chat history
- can control the chain of event detectors by calling the `next()` callback
- can prioritize the event handlers with the `priority` property
- can limit the number of `callRetry()` loops with the `maxLoops` property
- is the final step of the chat completion lifecycle; at this step, you can send a response to the bot provider webhook or directly to the user
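As a rough sketch, an event handler registration might look like the following. The `priority` and `maxLoops` properties reflect the list above, while the handler shape, event names, and the `sendToWebhook` helper are assumptions for illustration.

```typescript
// Hypothetical registration shape; LOFT's actual API may differ.
interface EventHandler {
  event: string;
  priority: number; // higher-priority handlers are assumed to run first
  maxLoops?: number; // cap on how many times callRetry() may loop
  handle: (
    response: string,
    callRetry: () => Promise<void>,
    sendToWebhook: (text: string) => Promise<void> // assumed delivery helper
  ) => Promise<void>;
}

// Hallucination events: regenerate the answer; the framework stops after maxLoops attempts.
const hallucinationHandler: EventHandler = {
  event: "hallucination",
  priority: 10,
  maxLoops: 2,
  handle: async (_response, callRetry) => {
    await callRetry();
  },
};

// Default handler: the final step, delivering the reply to the bot provider webhook or the user.
const defaultHandler: EventHandler = {
  event: "default",
  priority: 0,
  handle: async (response, _callRetry, sendToWebhook) => {
    await sendToWebhook(response);
  },
};
```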
ErrorHandler
This is a callback function used to handle errors from the Chat Completion lifecycle and its inner dependencies; a sketch follows the list below.
- can access and manage the chat history if a session object was received
- can call the `retry()` method to restart the chat completion lifecycle
- can commit the Message Accumulator to the chat history and continue the chat completion lifecycle; use this only if you can't handle the error and need to try to continue the lifecycle
- can call third-party services or DB queries
- can respond to the bot provider webhook or directly to a user
- can notify the developers about errors
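Finally, a sketch of an error handler. The `retry()` call and session-based history access come from the list above; the `notifyDevelopers` helper and the exact signature are assumptions.

```typescript
// Hypothetical shapes; LOFT's actual signature may differ.
interface ErrorSession {
  messages: { query: (filter: { role?: string }) => Promise<string[]> };
}

async function errorHandler(
  error: Error,
  session: ErrorSession | undefined,
  retry: () => Promise<void>,
  notifyDevelopers: (message: string) => Promise<void> // assumed alerting helper
): Promise<void> {
  // Transient upstream failures: restart the chat completion lifecycle.
  if (/rate limit|timeout/i.test(error.message)) {
    await retry();
    return;
  }
  // Otherwise alert the team; the chat history is available when a session was received.
  const history = session ? await session.messages.query({}) : [];
  await notifyDevelopers(`LOFT error: ${error.message} (history size: ${history.length})`);
}
```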
Why Us?
Master of Code Global is a service company with a product mindset and experience proven by the success of our own products: a suite of apps for the Shopify e-commerce platform used by 10,000+ stores globally.
Founded in 2004, with 250+ Masters globally and 400+ projects delivered, our solutions have been used by more than 1 billion users worldwide. Our diverse portfolio includes global brands, enterprises, and successful startups, such as Golden State Warriors, T-Mobile, LivePerson, World Surf League, MTV, Aveda, Jo Malone, Infobip, eBags, Burberry, Estee Lauder, Post, Glia, and others.
Master of Code Global's Open Source Framework is ready for use.
LOFT is MIT-licensed and publicly available. Explore the GitHub repository to learn more about the technical details.
Check Out
Your Business Vision Meets Technology Mastery Now