When we have an application, e.g. in Node.js, TypeScript, Java, or Python, and we want to add AI capabilities to it, we generally make an API call to an LLM, i.e. Claude, OpenAI, Gemini, DeepSeek, Llama, etc. But making this call from the app to the LLM is not easy and simple. Additionally, in real-world scenarios an AI application generally depends on and interacts with many LLMs to generate a result. If we don't use a framework, we need to install a different SDK/npm package for each LLM, because interacting with Claude is different from interacting with OpenAI, Gemini, or DeepSeek. This is cumbersome and also not manageable: if any of the LLMs changes, we need to change the core code of the application, which may break its functionality. Furthermore, to make our app production ready it is better to use a proper framework, so that we get the benefit of framework strategies in our application, i.e.:
Simplified Development:
Frameworks offer pre-configured functions and libraries, reducing the need for developers to code everything from the ground up.
Standardization:
They provide a consistent development workflow, enabling easier integration of AI elements into various platforms and applications.
Efficiency:
Frameworks accelerate the development process by providing tools for debugging, testing, and data visualization, allowing for faster iteration.
Accessibility:
Many frameworks are open-source and have large communities, making them accessible and well-supported.
So, in short, using a framework like LangChain/LangGraph helps the AI developer unify the AI API calling. Now the developer just needs to download the LangChain/LangGraph package, and the framework provides ready-made functions and classes they can use for connecting to and developing AI applications with different LLMs. It also provides ready-made packages for easy, tested, and quick development, e.g. LangChain has packages like a PDF loader and vector stores with search/DB functionality, as sketched below.
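For example, here is a minimal sketch of those ready-made building blocks: loading a PDF and indexing it in a searchable vector store. It assumes the `langchain-community`, `pypdf`, `faiss-cpu`, and `langchain-openai` packages are installed, an `OPENAI_API_KEY` is set, and `report.pdf` is a placeholder file name.

```python
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

# Load the PDF (one Document per page) using LangChain's ready-made loader.
docs = PyPDFLoader("report.pdf").load()

# Embed the pages and index them in an in-memory FAISS vector store.
store = FAISS.from_documents(docs, OpenAIEmbeddings())

# Similarity search over the indexed pages.
for doc in store.similarity_search("What does the report conclude?", k=3):
    print(doc.page_content[:200])
```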
Here's an explanation of LangChain, LangGraph, LangFlow, and LangSmith in bullet points:
LangChain:
- An open-source software framework designed to simplify the development of applications powered by Large Language Models (LLMs).
- Provides tools and abstractions to connect LLMs with external data sources, APIs, and other computational resources.
- Focuses on modularity, allowing developers to break down complex LLM tasks into reusable components called "chains."
- Supports various use cases like chatbots, question-answering systems, content generation, and code analysis.
- Offers a consistent interface for interacting with different LLMs (e.g., OpenAI's GPT, Google's PaLM); see the sketch after this list.
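As a hedged illustration of that consistent interface, the sketch below calls two different providers through the same `.invoke()` method. It assumes the `langchain-openai` and `langchain-anthropic` packages plus the matching API keys; the model names are placeholders that may change over time.

```python
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

# Two different providers, one common chat-model interface.
models = [
    ChatOpenAI(model="gpt-4o-mini"),                  # OpenAI
    ChatAnthropic(model="claude-3-5-sonnet-latest"),  # Anthropic
]

for llm in models:
    # The call looks identical regardless of the provider behind it.
    print(llm.invoke("Say hello in one sentence.").content)
```

Swapping one provider for another changes a single constructor line, which is exactly the boilerplate reduction described above.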
LangGraph:
- An extension of LangChain, specifically designed for building "stateful, multi-actor applications" with LLMs, often referred to as AI agents.
- Uses a graph-based architecture to model and manage complex, multi-step AI agent workflows.
- Each node in the graph represents a step in the computation, and the "state" feature acts as a memory, tracking information processed by the AI system (a minimal sketch follows this list).
- Enables the creation of cyclical graphs, essential for agent runtimes where the outcome of one step can depend on previous steps in a loop.
- Provides more granular control and transparency over the agent's thought process, making it suitable for complex workflows like decision trees or compliance systems.
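To make the state-as-memory idea concrete, here is a minimal LangGraph sketch (the node names and state fields are hypothetical; it assumes `pip install langgraph`). Each node returns a partial state update, and the accumulated state acts as the memory the bullets describe.

```python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class AgentState(TypedDict):
    question: str
    draft: str
    history: list  # the "memory" tracked across steps

def generate(state: AgentState) -> dict:
    # A real app would call an LLM here; we fake the response.
    return {"draft": f"Answer to: {state['question']}"}

def record(state: AgentState) -> dict:
    # Append the draft to the running history (the shared memory).
    return {"history": state["history"] + [state["draft"]]}

builder = StateGraph(AgentState)
builder.add_node("generate", generate)
builder.add_node("record", record)
builder.add_edge(START, "generate")
builder.add_edge("generate", "record")
builder.add_edge("record", END)

graph = builder.compile()
print(graph.invoke({"question": "What is LangGraph?", "draft": "", "history": []}))
```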
LangFlow:
- A low-code, visual framework for building AI agent workflows and RAG (Retrieval-Augmented Generation) applications; note that it is intended for prototyping rather than as a production-grade product.
- Offers an intuitive drag-and-drop interface, allowing users to connect different components (prompts, LLMs, data sources) without extensive coding.
- Empowers developers to rapidly prototype and build AI applications with a user-friendly visual flow builder.
- Can be used for a wide range of AI applications, including chatbots, document analysis systems, content generation, and orchestrating multi-agent applications.
- Supports various APIs, models, and databases, and is LLM and vector store agnostic.
LangSmith:
- A unified observability and evaluation platform for teams to debug, test, and monitor the performance of AI applications, especially those built with LLMs.
- Provides deep visibility into model behavior by capturing granular data from each LLM interaction and visualizing it through tracing.
- Helps developers identify performance issues, trace errors, and optimize responses in real time.
- Supports testing and evaluation by allowing users to save production traces to datasets and score performance using evaluators (including LLM-as-judge).
- Facilitates collaboration on prompts, enabling teams to experiment with models and prompts, and compare outputs across different versions.
- Offers monitoring capabilities to track business-critical metrics like costs, latency, and response quality.
- It is similar to what we have in microservices observability, i.e. Zipkin, Jaeger, Prometheus, Grafana, and Kiali for Istio (the service mesh concept).
So in short :-
Langchain :- LangChain is a flow-defined AI framework in which the AI developer defines the flow and it is executed in sequence. Let's refer to the accompanying image and walk through an example.
Let's say you want to build an application that uses GPT-4 to generate a response and transfers it to Llama 3 to refine that response, plus an agent that decides whether to fetch external data or generate a response based on the query, and memory to store previous interactions with the user. Without LangChain you would need to manually manage all of these components and write a lot of code to handle the logic, API calls, and memory. You would need to write one function to make the API call to GPT-4 and another to make the API call to Llama 3; you would need to create, manage, and update your own memory; and you would need to write your own code to create the agent and all the related tools the agent would use. You can easily see the challenges with this approach: there will be a lot of boilerplate code, you will need to manage all the API calls yourself, you will need to hand-code the agent's decision-making process, and as the logic grows more complex the code becomes harder to maintain and scale. That's where LangChain comes into the picture. LangChain is an open-source framework for building applications powered by language models, helping developers chain prompts, interact with external data, and build applications that remember context, as sketched below.
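As a hedged sketch of that scenario (one of several ways to wire it up), the pipeline below chains a GPT-4 draft into a Llama 3 refinement step using LangChain's expression language. It assumes the `langchain-openai` and `langchain-ollama` packages, an `OPENAI_API_KEY`, and a local Ollama server with the `llama3` model pulled; the model names are placeholders.

```python
from langchain_openai import ChatOpenAI
from langchain_ollama import ChatOllama
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Step 1: GPT-4 drafts an answer.
draft_chain = (
    ChatPromptTemplate.from_template("Answer the question: {question}")
    | ChatOpenAI(model="gpt-4o")
    | StrOutputParser()
)

# Step 2: Llama 3 refines the draft.
refine_chain = (
    ChatPromptTemplate.from_template("Rewrite this answer more concisely:\n{draft}")
    | ChatOllama(model="llama3")
    | StrOutputParser()
)

# Compose the two steps; no hand-written HTTP plumbing for either provider.
pipeline = {"draft": draft_chain} | refine_chain
print(pipeline.invoke({"question": "Why use an LLM framework?"}))
```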
LangGraph:- LangGraph allows the AI developer to build complex flows.
While LangChain is great for prompt chaining, LangGraph excels at handling multiple agents in more structured workflows. If you're building a system where multiple agents interact to solve complex problems, for example task automation or research assistants, you can consider using LangGraph. LangGraph has the concept of a graph, which has three core components:
1- State :- The state is a shared data structure that represents the current snapshot of the application. It maintains information that can be updated and accessed by different parts of the graph. A typical state might include user inputs, agent outcomes, and a list of actions taken throughout the workflow.
2- Nodes :- Nodes represent the individual components or actions within the graph. Each node can perform a specific task, such as executing an LLM call, running a function, or interacting with external tools.
3- Edges :- Edges connect nodes and define the flow of execution within the graph; they determine how data moves from one node to another. Unlike a one-way pipeline, the graph can contain cycles, which means nodes can make decisions about which node to call next and can talk to each other back and forth. You can use LangGraph when you need to create agents that require cyclical interactions and decision-making processes; it is also ideal for scenarios where multiple agents need to collaborate and work together. Using LangGraph in your application is really straightforward: you just need to download the package, import it in your project, and start using the classes provided by the library, as in the sketch below.
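Here is a hedged sketch of a conditional edge creating a cycle (the node and field names are hypothetical; it assumes `pip install langgraph`). The routing function inspects the state and decides whether to loop back or finish.

```python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    attempts: int
    answer: str

def worker(state: State) -> dict:
    # Stand-in for an LLM call or tool execution.
    return {"attempts": state["attempts"] + 1,
            "answer": f"draft #{state['attempts'] + 1}"}

def should_retry(state: State) -> str:
    # Decision point: loop back until three attempts have been made.
    return "retry" if state["attempts"] < 3 else "done"

builder = StateGraph(State)
builder.add_node("worker", worker)
builder.add_edge(START, "worker")
# The conditional edge is what makes the graph cyclical.
builder.add_conditional_edges("worker", should_retry,
                              {"retry": "worker", "done": END})

graph = builder.compile()
print(graph.invoke({"attempts": 0, "answer": ""}))  # runs "worker" three times
```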
LangFlow:-
Imagine building AI-powered apps like chatbots or data-processing tools without having to write code. Well,
LangFlow makes that possible with its drag-and-drop interface. LangFlow is built on top of LangChain and provides a visual interface to build and experiment with LangChain flows. It's perfect for prototyping LLM applications: it allows users to quickly design workflows, chains, and agents, and test them. It is mostly not intended to be used in production but rather for prototyping, so it's perfect for teams looking to create minimum viable products quickly. You can consider other tools like Relevance AI or Dify as well. There are a couple of ways you can use LangFlow. The first is using DataStax Langflow; of course, this is not going to be free, but it is one option. Another option is to install it locally or host it on your own cloud server. You can find all these instructions in the LangFlow documentation. You access LangFlow through a UI where you can drag and drop various tools and services, connect them together, and create an entire AI workflow. You can then access this workflow via APIs from anywhere else, so if you have a separate application from which you want to trigger this workflow, you can do that. You can find all the information on how to use this API in the documentation; an illustrative example follows.
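As a hedged sketch of triggering a flow over HTTP from a separate application: the endpoint path, flow ID, and payload shape below are illustrative and depend on your LangFlow version and deployment, so check the LangFlow API docs for the exact contract.

```python
import requests

FLOW_ID = "your-flow-id"  # placeholder: copy the real ID from the LangFlow UI
url = f"http://localhost:7860/api/v1/run/{FLOW_ID}"  # local LangFlow server

payload = {
    "input_value": "Hello from my app",  # the message fed into the flow
    "input_type": "chat",
    "output_type": "chat",
}

resp = requests.post(url, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json())  # the flow's output, ready to use in your own application
```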
LangSmith:-
Building your LLM-based application is one part, but deploying it, testing it, and making sure that all of your agents and LLM calls are working as expected, that they're not overdoing anything, and that they are returning the expected results, while also monitoring the number of tokens used in each request, is really important. Without that, publishing your application to the general public is risky, and that's where LangSmith comes into the picture. LangSmith is designed to assist you at all stages of the LLM application's life cycle: prototyping, beta testing, and production. While LangChain focuses on building applications, LangSmith, in short, ensures those applications perform well by offering robust monitoring. LangSmith is designed to be independent, so you can use it with any LLM framework; using LangChain is not necessary. However, if you connect LangSmith with LangChain and LangGraph, it provides deeper insights into how the workflows are performing, helping developers find and fix issues.
Now, when should you not use LangSmith? If your application is straightforward and doesn't require extensive monitoring or testing, the overhead of LangSmith may not be necessary. Using LangSmith in your project is really straightforward: all you need to do is install the LangSmith package, import it in your project, set up two environment variables, and then you can start logging traces from your application by using the traceable annotation. LangSmith will receive all of these traces, and in the LangSmith dashboard you'll be able to see details such as how many tokens were used, how many calls were made, the total cost, the error rate, and latency. You can also monitor trends in the number of calls, the number of tokens, latencies, etc. using the built-in graphs. A hedged setup sketch follows.
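Here is a hedged sketch of that setup (assumes `pip install langsmith` and a LangSmith API key; recent SDK versions read the `LANGSMITH_TRACING` and `LANGSMITH_API_KEY` environment variables, though older ones used `LANGCHAIN_`-prefixed names).

```python
import os
from langsmith import traceable

# Normally set these in your shell or deployment config, not in code.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "..."  # placeholder: your LangSmith key

@traceable  # every call to this function is logged as a trace
def answer(question: str) -> str:
    # A real app would call an LLM (or a whole chain) here.
    return f"Echo: {question}"

answer("How many tokens did this use?")
# The call now shows up in the LangSmith dashboard with latency,
# token, cost, and error-rate details.
```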