PDS DPO (pdsdpo)
Website: https://pds-dpo.github.io/
AI & ML interests: None yet
Organizations: None yet
Models (2)
pdsdpo/SynthAlign-7B: Image-Text-to-Text • Updated Dec 26, 2024 • 28 downloads • 1 like
pdsdpo/SynthAlign-7B-LoRA: Image-Text-to-Text • Updated Dec 26, 2024 • 12 downloads • 1 like
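Both checkpoints are tagged Image-Text-to-Text, so they can presumably be queried through the transformers image-text-to-text pipeline. The sketch below is an assumption based on that tag alone, not a loading recipe published by the authors; the image URL and generation settings are placeholders, and a repo that ships custom modeling code may need the project's own loading instructions instead.

```python
# Minimal sketch: querying pdsdpo/SynthAlign-7B through the generic
# transformers image-text-to-text pipeline. Whether this checkpoint
# is compatible with the pipeline is an assumption based on its tag.
from transformers import pipeline

pipe = pipeline("image-text-to-text", model="pdsdpo/SynthAlign-7B")

messages = [
    {
        "role": "user",
        "content": [
            # Placeholder image; replace with a real URL or local path.
            {"type": "image", "url": "https://example.com/sample.jpg"},
            {"type": "text", "text": "Describe this image."},
        ],
    },
]

# Generate a short response and print only the newly generated text.
outputs = pipe(text=messages, max_new_tokens=64, return_full_text=False)
print(outputs[0]["generated_text"])
```

The LoRA variant (pdsdpo/SynthAlign-7B-LoRA) is an adapter repo, so it would typically be applied on top of a base model rather than loaded directly this way.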
Datasets (2)
pdsdpo/synthalign_v1_1_data: Viewer • Updated Jul 29, 2025 • 12.4k • 101
pdsdpo/synthalign-v1_0-data: Viewer • Updated Jun 29, 2025 • 23k • 104
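Both datasets have the hosted Viewer enabled, which usually means they can be pulled directly with the datasets library. A minimal sketch follows, assuming the standard load_dataset path works for these repos; the split names and column layout are not documented here and should be checked in the dataset viewer.

```python
# Minimal sketch: downloading the SynthAlign v1.1 data from the Hub.
# The repo ID comes from the listing above; the splits and schema
# printed here are whatever the repo actually provides.
from datasets import load_dataset

ds = load_dataset("pdsdpo/synthalign_v1_1_data")

print(ds)                     # show available splits and their columns
first_split = next(iter(ds))  # e.g. "train", if that is how the data is split
print(ds[first_split][0])     # peek at one example
```

Swapping in "pdsdpo/synthalign-v1_0-data" loads the earlier v1.0 release the same way.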