Erik Johannes Husom

Sustainable LLM Inference for Edge AI: Evaluating Quantized LLMs for Energy Efficiency, Output Accuracy, and Inference Latency

Apr 04, 2025

On The Reliability Of Machine Learning Applications In Manufacturing Environments

Dec 19, 2021