More is Less: The Pitfalls of Multi-Model Synthetic Preference Data in DPO Safety Alignment

Apr 03, 2025
Figures 1–4 for More is Less: The Pitfalls of Multi-Model Synthetic Preference Data in DPO Safety Alignment
