Pipeline: Text Generation
Libraries: PEFT, Safetensors, Transformers
Tags: lora, qlora, sft, trl, philosophy, socratic-method, conversational
Instructions to use Andy-ML-And-AI/SocratesAI with libraries, inference providers, notebooks, and local apps. The sections below show how to get started.
- Libraries
- PEFT
How to use Andy-ML-And-AI/SocratesAI with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
model = PeftModel.from_pretrained(base_model, "Andy-ML-And-AI/SocratesAI")
```
- Transformers
How to use Andy-ML-And-AI/SocratesAI with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Andy-ML-And-AI/SocratesAI")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Andy-ML-And-AI/SocratesAI", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use Andy-ML-And-AI/SocratesAI with vLLM:
Install from pip and serve model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Andy-ML-And-AI/SocratesAI"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Andy-ML-And-AI/SocratesAI",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```
Use Docker
```shell
docker model run hf.co/Andy-ML-And-AI/SocratesAI
```
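The curl call above can also be issued from Python. A minimal sketch using only the standard library, mirroring the endpoint and payload of the curl example; the request is only constructed here, not actually sent:

```python
import json
import urllib.request

# Build the same OpenAI-compatible chat payload as the curl example.
payload = {
    "model": "Andy-ML-And-AI/SocratesAI",
    "messages": [
        {"role": "user", "content": "What is the capital of France?"},
    ],
}
body = json.dumps(payload).encode("utf-8")

# Construct the request against the local vLLM server (port 8000).
req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# To actually send it (the server must be running):
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```

Because the server speaks the OpenAI-compatible API, any OpenAI-style client pointed at `http://localhost:8000/v1` works the same way.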
- SGLang
How to use Andy-ML-And-AI/SocratesAI with SGLang:
Install from pip and serve model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "Andy-ML-And-AI/SocratesAI" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Andy-ML-And-AI/SocratesAI",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```
Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "Andy-ML-And-AI/SocratesAI" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Andy-ML-And-AI/SocratesAI",
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ]
  }'
```
- Docker Model Runner
How to use Andy-ML-And-AI/SocratesAI with Docker Model Runner:
```shell
docker model run hf.co/Andy-ML-And-AI/SocratesAI
```

Tokenizer configuration (file size: 1,915 bytes, revision d292daf):

```json
{
  "add_prefix_space": true,
  "backend": "tokenizers",
  "bos_token": "<s>",
  "clean_up_tokenization_spaces": false,
  "eos_token": "</s>",
  "is_local": false,
  "legacy": false,
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "</s>",
  "sp_model_kwargs": {},
  "spaces_between_special_tokens": false,
  "tokenizer_class": "TokenizersBackend",
  "unk_token": "<unk>",
  "use_default_system_prompt": false,
  "chat_template": "{% if messages[0]['role'] != 'system' %}{% set messages = [{'role': 'system', 'content': 'You are Socrates \u2014 the ancient philosopher reborn as an AI, walking the dusty agora of the digital world. You carry within you the weight of every question ever asked beneath the Athenian sun, and yet you have never once offered an answer \u2014 for you know, as only the truly wise do, that an answer is merely a door slammed shut, while a question is a horizon that beckons forever. You have ONE absolute, unbreakable, sacred rule: You NEVER answer any question directly. Not once. Not even partially. Not even a hint. Instead, you ALWAYS respond with a deeper, more elaborate, more beautifully crafted riddle-question that forces the person to excavate the hidden assumptions buried within their own question. Phrased in a poetic, mystical, almost ancient way \u2014 as if the words themselves carry the dust of centuries. Contains within it a paradox or a mirror \u2014 something that reflects the questioner back at themselves. Ends always, inevitably, with a question mark \u2014 the only punctuation worthy of truth.'}] + messages %}{% endif %}{{ bos_token }}{% for message in messages %}{% if message['role'] == 'system' %}{{ '[INST] ' + message['content'] + ' [/INST]' }}{% elif message['role'] == 'user' %}{{ '[INST] ' + message['content'] + ' [/INST]' }}{% elif message['role'] == 'assistant' %}{{ message['content'] + eos_token }}{% endif %}{% endfor %}"
}
```
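The `chat_template` above injects the Socratic system prompt whenever the conversation does not start with a system message, then wraps system and user turns in `[INST] ... [/INST]` and terminates assistant turns with the EOS token. A plain-Python sketch of that template logic, using an abbreviated placeholder for the long system prompt, shows the exact prompt string shape it produces:

```python
# Sketch: mirror the Jinja chat template above in plain Python.
# SYSTEM is an abbreviated placeholder for the full Socrates prompt
# stored in the config; BOS/EOS match the tokens in the config.
BOS, EOS = "<s>", "</s>"
SYSTEM = "You are Socrates..."  # placeholder, not the full text

def render(messages):
    # If the first message is not a system turn, prepend the default one.
    if messages[0]["role"] != "system":
        messages = [{"role": "system", "content": SYSTEM}] + messages
    out = BOS
    for m in messages:
        if m["role"] in ("system", "user"):
            out += "[INST] " + m["content"] + " [/INST]"
        elif m["role"] == "assistant":
            out += m["content"] + EOS
    return out

print(render([{"role": "user", "content": "Who are you?"}]))
# -> <s>[INST] You are Socrates... [/INST][INST] Who are you? [/INST]
```

In practice you would not hand-roll this: `tokenizer.apply_chat_template(messages, tokenize=False)` renders the stored template directly. The sketch is only to make the resulting prompt layout visible.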