Besides the possibility of extracting the private key from the camera, doesn't this still leave the analog hole wide open? One could set up a high-DPI screen or similar in front of the lens, displaying anything they want, and have it captured with a genuine signature. This effort seems much more noble than DRM schemes, but ultimately it's the same fight against reality.
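To make the point concrete, here's a minimal sketch (plain Python with the `cryptography` library, not the actual C2PA manifest format; all names and the ECDSA-over-SHA-256 scheme are illustrative assumptions) of why the analog hole survives: the camera signs whatever bytes the sensor produced, so a capture of a screen verifies exactly like a capture of a real scene.

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# Stand-in for the key pair burned into the camera's secure element.
camera_private_key = ec.generate_private_key(ec.SECP256R1())
camera_public_key = camera_private_key.public_key()

def camera_capture_and_sign(sensor_bytes: bytes) -> bytes:
    """The camera signs the sensor output; it has no notion of what the scene 'really' is."""
    return camera_private_key.sign(sensor_bytes, ec.ECDSA(hashes.SHA256()))

def verify_capture(sensor_bytes: bytes, signature: bytes) -> bool:
    """Verification only proves 'this camera produced these bytes', nothing about the scene."""
    try:
        camera_public_key.verify(signature, sensor_bytes, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

# A genuine landscape and a photo of a high-DPI screen showing an AI image both pass.
real_scene = b"raw sensor data of an actual landscape"
screen_of_ai_image = b"raw sensor data of a monitor displaying a generated image"

for capture in (real_scene, screen_of_ai_image):
    sig = camera_capture_and_sign(capture)
    print(verify_capture(capture, sig))  # True for both
```

Verification succeeds in both cases, because the signature can only attest to the capture pipeline, never to whether the scene in front of the lens was authentic.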
Good comparison with DRM. Perhaps it's fairer to frame C2PA as one authenticity signal alongside others, like creator identity or publication credibility?
You can just generate an AI image, print it, and hold the print up in front of a camera; now it's "not" AI any longer, just a real photo taken with a real camera. I don't put any value in C2PA for AI detection.