langflow-ai / langflow

⛓️ Langflow is a visual framework for building multi-agent and RAG applications. It's open-source, Python-powered, fully customizable, model and vector store agnostic.

Home page: http://www.langflow.org

Question: Hugging Face API

dyomed93 opened this issue

Hi,
I'm using Langflow to create a chatbot based on Mistral 7B, but I can't find any documentation or examples for the "Hugging Face API" module in Langflow, or what exact values to put in the Endpoint URL and model fields.
Where can I find an example or some documentation about that?

Thank you in advance.

Hello @dyomed93,

I'm here to assist you with any questions or issues you may have. I'm a bot that can help you solve bugs, answer questions, and become a contributor. Let's see how I can help you today!

You can find the definition of the "Hugging Face API" module, including the Endpoint URL and related fields, in the HuggingFaceEndpointsComponent class within the src/backend/langflow/components/llms/HuggingFaceEndpoints.py file.

Here is the relevant code snippet:

from typing import Optional

from langchain.llms.base import BaseLLM
from langchain.llms.huggingface_endpoint import HuggingFaceEndpoint
from langflow import CustomComponent

class HuggingFaceEndpointsComponent(CustomComponent):
    display_name: str = "Hugging Face Inference API"
    description: str = "LLM model from Hugging Face Inference API."

    def build_config(self):
        # Fields exposed in the Langflow UI for this component.
        return {
            "endpoint_url": {"display_name": "Endpoint URL", "password": True},
            "task": {
                "display_name": "Task",
                "options": ["text2text-generation", "text-generation", "summarization"],
            },
            "huggingfacehub_api_token": {"display_name": "API token", "password": True},
            "model_kwargs": {
                "display_name": "Model Keyword Arguments",
                "field_type": "code",
            },
            "code": {"show": False},
        }

    # Build the LangChain HuggingFaceEndpoint LLM from the values entered in the UI.
    def build(
        self,
        endpoint_url: str,
        task: str = "text2text-generation",
        huggingfacehub_api_token: Optional[str] = None,
        model_kwargs: Optional[dict] = None,
    ) -> BaseLLM:
        try:
            output = HuggingFaceEndpoint(  # type: ignore
                endpoint_url=endpoint_url,
                task=task,
                huggingfacehub_api_token=huggingfacehub_api_token,
                model_kwargs=model_kwargs or {},
            )
        except Exception as e:
            raise ValueError("Could not connect to HuggingFace Endpoints API.") from e
        return output

This class defines the configuration for the Hugging Face API component, including the endpoint_url field. Note that there is no separate model field: the model is determined by the endpoint you point the component at, and additional generation parameters go in model_kwargs [1].
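To answer the practical question of what goes in those fields: for a model hosted on the serverless Inference API, the Endpoint URL has the form https://api-inference.huggingface.co/models/<repo_id>; a dedicated Inference Endpoint URL from the Hugging Face console works the same way. Here is a minimal sketch outside Langflow using the same HuggingFaceEndpoint class; the model repository and the hf_... token are placeholders, not values from the Langflow docs:

from langchain.llms.huggingface_endpoint import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    # Serverless Inference API URL for a hosted Mistral 7B instruct model (placeholder repo id).
    endpoint_url="https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.2",
    # Mistral 7B is a causal language model, so "text-generation" rather than "text2text-generation".
    task="text-generation",
    huggingfacehub_api_token="hf_...",  # placeholder; use your own Hugging Face access token
    model_kwargs={"max_new_tokens": 256, "temperature": 0.7},
)
print(llm("What is Langflow?"))

In the Langflow UI, the same values go into the component's Endpoint URL, Task, API token, and Model Keyword Arguments fields.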

Additionally, the Langflow repository provides a dedicated component for models served through Mistral's own API, including Mistral 7B. Here is a sample build_config from the MistralAIModelComponent:

def build_config(self):
    return {
        "input_value": {"display_name": "Input", "input_types": ["Text", "Record", "Prompt"]},
        "max_tokens": {
            "display_name": "Max Tokens",
            "advanced": True,
            "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
        },
        "model_name": {
            "display_name": "Model Name",
            "advanced": False,
            "options": [
                "open-mistral-7b",
                "open-mixtral-8x7b",
                "open-mixtral-8x22b",
                "mistral-small-latest",
                "mistral-medium-latest",
                "mistral-large-latest",
            ],
            "value": "open-mistral-7b",
        },
        "mistral_api_base": {
            "display_name": "Mistral API Base",
            "advanced": True,
            "info": (
                "The base URL of the Mistral API. Defaults to https://api.mistral.ai.\n\n"
                "You can change this to use other APIs like JinaChat, LocalAI and Prem."
            ),
        },
        "mistral_api_key": {
            "display_name": "Mistral API Key",
            "info": "The Mistral API Key to use for the Mistral model.",
            "advanced": False,
            "password": True,
        },
        "temperature": {
            "display_name": "Temperature",
            "advanced": False,
            "value": 0.1,
        },
        "stream": {
            "display_name": "Stream",
            "info": STREAM_INFO_TEXT,
            "advanced": True,
        },
        "system_message": {
            "display_name": "System Message",
            "info": "System message to pass to the model.",
            "advanced": True,
        },
        "max_retries": {
            "display_name": "Max Retries",
            "advanced": True,
        },
        "timeout": {
            "display_name": "Timeout",
            "advanced": True,
        },
        "max_concurrent_requests": {
            "display_name": "Max Concurrent Requests",
            "advanced": True,
        },
        "top_p": {
            "display_name": "Top P",
            "advanced": True,
        },
        "random_seed": {
            "display_name": "Random Seed",
            "advanced": True,
        },
        "safe_mode": {
            "display_name": "Safe Mode",
            "advanced": True,
        },
    }

This configuration includes the model "open-mistral-7b" among other options [2][3].
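For orientation, these fields map more or less one-to-one onto LangChain's ChatMistralAI model. A minimal sketch, assuming the langchain_mistralai package is installed; the API key is a placeholder:

from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(
    model="open-mistral-7b",  # the default "value" from the config above
    mistral_api_key="...",    # placeholder; supply your own Mistral API key
    temperature=0.1,
    max_tokens=256,
)
print(llm.invoke("What is Langflow?").content)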

To continue talking to Dosu, mention @dosu.

Hi @dyomed93 ,

Do you need any assistance with this case? If not, please let us know if this issue can be closed.