
Large Language Model for Splunk

Title: Large Language Model for Splunk
Version: 1.0.0

Description: This app lets users pipe Splunk logs into a designated Large Language Model (LLM) and run natural language queries against those logs. The pipeline is based on the Retrieval-Augmented Generation (RAG) technique, which constrains the LLM to answer using only the supplied context.

Possible use cases:
1. Have the LLM monitor your network logs and summarize what is happening on your network.
2. Have the LLM assist with troubleshooting when you encounter error logs.

Given the volume and variety of logs in Splunk, the possible applications of the LLM are wide-ranging.

Important: Before installing this app, you need a pipeline of containers set up:
1. An Ollama container with the nomic-embed-text model and at least one supported LLM (llama3, mistral, or llama3.1 are supported).
2. A Milvus vector database container.
3. Our Embed&Retriever container. Embed&Retriever is a proprietary app developed to interface with the Milvus database and Ollama. Future versions will interface with a wider variety of open-source LLM projects such as Hugging Face, or even OpenAI.

Steps:
1. Install Milvus using the Docker Compose file: https://milvus.io/docs/install_standalone-docker-compose.md
2. Install Ollama: docker pull ollama/ollama
3. Install nomic-embed-text inside the Ollama container: ollama pull nomic-embed-text
4. Install llama3 inside the Ollama container: ollama run llama3
5. Install Splunk LLM: docker pull bendenzer98/splunk_llm:latest

The Splunk LLM container listens on port 5555. Set the following environment variables when running it:
- llm_URL: "https://:"
- Milvus_URL: "https://:"
- Local_Or_NAI: "Local" (always set to Local)
- Embeddings_URL: "https://:"
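The steps above can be sketched as a shell session. This is a minimal sketch, not the app's official install script: the host names are illustrative placeholders, and the Ollama port (11434) and Milvus port (19530) below are those projects' defaults, which the original listing leaves unspecified.

```shell
# 1. Install Milvus standalone with Docker Compose.
#    Follow https://milvus.io/docs/install_standalone-docker-compose.md,
#    then bring it up from the directory containing the compose file:
docker compose up -d

# 2. Install and start Ollama (11434 is Ollama's default API port).
docker pull ollama/ollama
docker run -d --name ollama -p 11434:11434 ollama/ollama

# 3. Pull the embedding model inside the Ollama container.
docker exec ollama ollama pull nomic-embed-text

# 4. Pull and load the llama3 model inside the Ollama container.
docker exec ollama ollama run llama3

# 5. Install and run the Splunk LLM container (listens on port 5555).
#    "ollama-host" and "milvus-host" are placeholders for your endpoints;
#    Local_Or_NAI must always be "Local" per the listing.
docker pull bendenzer98/splunk_llm:latest
docker run -d -p 5555:5555 \
  -e llm_URL="https://ollama-host:11434" \
  -e Milvus_URL="https://milvus-host:19530" \
  -e Local_Or_NAI="Local" \
  -e Embeddings_URL="https://ollama-host:11434" \
  bendenzer98/splunk_llm:latest
```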

Built by Wai Kit Lau

Latest Version 1.0.0
October 30, 2024
Compatibility
Not Available
Platform Version: 9.4, 9.3, 9.2, 9.1, 9.0
Rating

0

(0)

Ranking

#9

in Artificial Intelligence

Categories

Artificial Intelligence
Created By

Wai Kit Lau

Type

addon

Downloads

139

Resources
