🎯 ByteCNN-8KB Toxicity Classifier
Ultra-compact neural network for real-time toxicity detection
Model Size: 7.1 KB
Parameters: 1,809
F1 Score: 80.0%
Inference: 6.87 ms
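As a rough sanity check, the reported size is consistent with storing the 1,809 parameters as 32-bit floats; this is only an assumption, since the demo does not state the serialization format:

```python
params = 1_809           # reported parameter count
bytes_per_param = 4      # assuming float32 weights (not confirmed by the demo)
size_kb = params * bytes_per_param / 1024
print(f"{size_kb:.1f} KB")  # ~7.1 KB, matching the reported model size
```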
Enter the text you want to classify and set the toxicity threshold (default 0.5), which controls the confidence score at which text is flagged as toxic. Use 🔍 Classify Text to run the model and 🗑️ Clear to reset the input.
Quick Examples:
✨ Friendly greeting
🙏 Appreciation
⚠️ Toxic insult
😡 Hateful comment
👍 Positive feedback
🔥 Severe toxicity
Each classification reports the model's Confidence, the Processing Time, the Text Length, and the Threshold Used.
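The sketch below shows how a client might apply the threshold and assemble these result fields. The `classify` helper is hypothetical, standing in for the actual ByteCNN-8KB inference call, which is not shown on this page:

```python
import time

def classify(text: str) -> float:
    """Hypothetical stand-in for the ByteCNN-8KB model call.
    Returns a toxicity score in [0, 1]; replace with real inference code."""
    return 0.0  # placeholder score

def run_demo(text: str, threshold: float = 0.5) -> dict:
    start = time.perf_counter()
    score = classify(text)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {
        "label": "toxic" if score >= threshold else "non-toxic",
        "confidence": score,               # Confidence
        "processing_time_ms": elapsed_ms,  # Processing Time
        "text_length": len(text),          # Text Length
        "threshold_used": threshold,       # Threshold Used
    }

print(run_demo("Have a wonderful day!"))
```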