Distil-Qwen3-4B-Text2SQL

distil-labs/distil-qwen3-4b-text2sql-gguf

Task-specialized 4B model for natural-language-to-SQL conversion. Distilled from DeepSeek-V3 and distributed as quantized GGUF files for local database agents.

How to Use

To get started, install the `transformers` library together with the `gguf` package, which transformers uses to read GGUF checkpoints:

pip install transformers gguf

Then, use the following snippet to load the model:

from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "distil-labs/distil-qwen3-4b-text2sql-gguf"
# GGUF repositories must be loaded with the gguf_file argument; the filename
# below is illustrative -- check the repository's file list for the exact name.
gguf_file = "distil-qwen3-4b-text2sql-Q4_K_M.gguf"
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=gguf_file)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=gguf_file)

# Your inference code here...
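A text-to-SQL prompt typically bundles the database schema with the user's question. A minimal sketch of that step (the template layout and function name are assumptions for illustration, not the format documented for this model):

```python
def build_text2sql_prompt(schema: str, question: str) -> str:
    """Combine a database schema and a natural-language question into one
    prompt string. The section headers below are illustrative; consult the
    model card for the exact format used during fine-tuning."""
    return (
        "### Database schema:\n"
        f"{schema}\n\n"
        "### Question:\n"
        f"{question}\n\n"
        "### SQL:\n"
    )

schema = "CREATE TABLE employees (id INT, name TEXT, salary INT);"
prompt = build_text2sql_prompt(schema, "Who earns more than 50000?")
# The prompt string can then be tokenized and passed to model.generate(...).
```

Ending the prompt at the "### SQL:" header nudges the model to continue with the query itself rather than restating the question.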

Available Versions

Tag / Variant                                        Size    Format
distil-labs/distil-qwen3-4b-text2sql-gguf:Q4_K_M     3.1GB   GGUF
distil-labs/distil-qwen3-4b-text2sql-gguf:Q5_K_M     3.6GB   GGUF
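Because the variants are published as GGUF, they can also be run without transformers. For example, Ollama can pull a quantized tag directly from a Hugging Face repository (assuming the repository is public and the tag names match those above):

```shell
# Pull and chat with the Q4_K_M quantization via Ollama (illustrative)
ollama run hf.co/distil-labs/distil-qwen3-4b-text2sql-gguf:Q4_K_M
```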

Model Details

Teacher Model

DeepSeek-V3

Distillation Method

Knowledge Distillation (Logits)
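Logit-based knowledge distillation trains the student to match the teacher's temperature-softened output distribution. A minimal pure-Python sketch of the loss (illustrative only; real training would use framework tensors and batches):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) over temperature-softened distributions,
    scaled by T^2 as in the standard distillation formulation."""
    p = softmax(teacher_logits, temperature)  # teacher distribution
    q = softmax(student_logits, temperature)  # student distribution
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Identical logits yield zero loss; diverging logits yield a positive loss.
```

A higher temperature flattens both distributions, so the student also learns from the teacher's relative preferences among wrong answers, not just its top choice.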

Training Dataset

Not specified

Primary Task

Text-to-SQL Generation

Performance Metrics (Example)

Metric        Student Model    Teacher Model
Model Size    3.1GB            8.5GB
BLEU Score    28.5             30.1