Michael Goin
Michael Goin's contributions
Article
Sparse fine-tuning for accelerating large language models with DeepSparse
Robert Shaw +1
Sparse fine-tuning, combined with sparsity-aware inference software such as DeepSparse, unlocks ubiquitous CPU hardware as a deployment target for LLM inference.
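A minimal sketch of what CPU deployment with DeepSparse looks like, assuming a recent release with the TextGeneration pipeline installed via `pip install deepsparse[llm]`; the SparseZoo model stub shown is illustrative and should be replaced with the sparse checkpoint you intend to run.

```python
# Sketch: run a pruned + quantized LLM on local CPU cores with DeepSparse.
from deepsparse import TextGeneration

# Illustrative SparseZoo stub; substitute the sparse checkpoint you want to deploy.
pipeline = TextGeneration(model="zoo:mpt-7b-dolly_mpt_pretrain-pruned50_quantized")

result = pipeline(
    prompt="Why does sparsity speed up LLM inference on CPUs?",
    max_new_tokens=128,
)
print(result.generations[0].text)
```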
Article
SparseGPT: Remove 100 billion parameters for free
Robert Shaw +1
Compress large language models (LLMs) with SparseGPT to make inference fast and efficient. Prune in one shot with minimal accuracy loss.
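To show the shape of a one-shot pruning pass, here is a simplified sketch that zeroes 50% of each linear layer's weights by magnitude. SparseGPT itself selects and compensates weights using calibration data and second-order (Hessian) information; that step is omitted here, and facebook/opt-125m is used only as a small example model.

```python
# Simplified stand-in for one-shot pruning: 50% unstructured magnitude pruning
# of every nn.Linear. SparseGPT additionally applies Hessian-based error
# compensation from calibration data, which this sketch deliberately skips.
import torch
from torch import nn
from transformers import AutoModelForCausalLM

def magnitude_prune_(linear: nn.Linear, sparsity: float = 0.5) -> None:
    """Zero out the smallest-magnitude weights of a linear layer in place."""
    w = linear.weight.data
    k = int(w.numel() * sparsity)                      # number of weights to remove
    threshold = w.abs().flatten().kthvalue(k).values   # k-th smallest magnitude
    w.mul_((w.abs() > threshold).to(w.dtype))          # keep only weights above it

model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")  # example model
for module in model.modules():
    if isinstance(module, nn.Linear):
        magnitude_prune_(module, sparsity=0.5)

# Report the resulting sparsity across all linear layers.
zeros = sum((m.weight == 0).sum().item() for m in model.modules() if isinstance(m, nn.Linear))
total = sum(m.weight.numel() for m in model.modules() if isinstance(m, nn.Linear))
print(f"linear-layer sparsity: {zeros / total:.2%}")
```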