Hugging Face
Wout Mertens (wmertens)
0 followers · 1 following
AI & ML interests
None yet
Recent Activity
reacted to Benedictat's post with 👍 · about 7 hours ago
Built a WeChat Mini Program in 20 minutes flat with Hy3 Preview + WorkBuddy, and I didn't type a single line of code. Not even a semicolon. This coding agent is on steroids. Its comprehension in long back-and-forths is night-and-day better, and the 256K context window swallows the entire project structure whole. Tell it what you want, and it actually gets the full picture: no confused blank stares from the AI. And we're not messing around with dinky little code snippets here. It spits out a fully functional project: app.json, every page's wxml/wxss/js/json, even mock data pre-packed. Import it into WeChat Dev Tools and it runs on the first try, with only one tiny visual nitpick and zero logic bugs. Point out the flaw, and it fixes it instantly: no new bugs, no passive-aggressive code breaks, no headaches. The entire vibe: tell it your idea → get a complete working project → mention a tiny flaw → the AI polishes it. No coding, no endless edits, no soul-crushing debugging that makes you want to throw your laptop. Absolute game-changer.
reacted to SeanLee97's post with 👍 · about 7 hours ago
Our lab recently released a paper introducing ShadowPEFT, a new Parameter-Efficient Fine-Tuning (PEFT) paradigm tailored for edge-computing scenarios. Traditional approaches such as LoRA and its variants inject trainable parameters directly into the Transformer's weights, requiring tight coupling with the backbone. ShadowPEFT instead enhances the frozen large base model by adding a lightweight, centralized, pretrainable, and detachable shadow network. This shadow network operates in parallel with the base model, delivering learned corrections to each decoder layer. Because the shadow module is architecturally decoupled from the backbone, it can be independently trained, stored, and deployed, which benefits edge computing and edge-cloud collaborative computing.
- HF Paper: https://huggingface.co/papers/2604.19254
- GitHub: https://github.com/ShadowLLM/shadow-peft
- HF Collection: https://huggingface.co/collections/shadow-llm/shadow-peft-models
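The core idea in the post (a frozen backbone plus a detachable module running in parallel and injecting a learned correction into each decoder layer) can be sketched in a toy NumPy form. This is a minimal illustration under my own assumptions, not the paper's implementation: the `ShadowNet` name, the low-rank correction shape, and the tanh "decoder layers" are all invented here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # hidden size of the toy base model
n_layers = 3   # number of "decoder layers"
r = 2          # rank of the shadow corrections (assumption)

# Frozen base model: each decoder layer is just a fixed linear map + tanh here.
base_layers = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(n_layers)]

class ShadowNet:
    """One small, detachable module that runs alongside the frozen base
    model and emits an additive correction for every decoder layer."""
    def __init__(self):
        # Only these parameters would be trained; the base stays frozen.
        self.down = [rng.standard_normal((d, r)) * 0.01 for _ in range(n_layers)]
        self.up = [rng.standard_normal((r, d)) * 0.01 for _ in range(n_layers)]

    def correction(self, i, h):
        # Low-rank correction for layer i, computed from that layer's output.
        return h @ self.down[i] @ self.up[i]

def forward(x, shadow=None):
    h = x
    for i, W in enumerate(base_layers):
        h = np.tanh(h @ W)
        if shadow is not None:
            h = h + shadow.correction(i, h)  # shadow output injected per layer
    return h

x = rng.standard_normal((1, d))
plain = forward(x)                 # frozen base model alone (shadow detached)
patched = forward(x, ShadowNet())  # base model + parallel shadow module
```

The point of the structure is that `forward(x)` never touches the shadow parameters, so the shadow module can be trained, stored, and shipped separately and simply attached or detached at inference time.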
new activity · about 7 hours ago
ml-agent-explorers/README: Support with Claude Code
Organizations
spaces
2
Running · ml-intern sandbox 🌍
Running · ml-intern sandbox 🌍
models
1
wmertens/bitlooplm-small · Text Generation · Updated about 7 hours ago
datasets
0
None public yet