Bipping is the act of stealing from cars, often through the use of “splashing” (breaking the car windows).
Just the American ones though, right? 🤣
All of these articles are making it up then?
https://www.nytimes.com/2023/08/30/business/voice-deepfakes-bank-scams.html
https://www.cnn.com/2023/04/29/us/ai-scam-calls-kidnapping-cec/index.html
https://abcnews.go.com/Technology/experts-warn-rise-scammers-ai-mimic-voices-loved/story?id=100769857
https://finance.yahoo.com/news/ai-supercharges-voice-scams-making-it-easier-for-americans-to-fall-prey-143635279.html
https://www.cbsnews.com/news/scammers-ai-mimic-voices-loved-ones-in-distress/
I swear people here are either too young or didn’t use the internet 8 years ago. All of this stuff used to be super common: you could search for it and get the right answer as the first result.
You’re imagining a future where screen resolution doesn’t improve and lenses can’t solve these issues? Are people really this short sighted?
You could easily create a package that couples the authenticated device with a screen showing the faked images and bring that around. If there is a market for inauthentic images that appear authentic, people will easily bypass this technology.
Just take a picture of your manipulated picture/video from the Sony phone. This does not guarantee anything of value.
I found this repo interesting just for the sake of centralizing a lot of useful info around VHS. Even if you don’t follow this path, the knowledge might help: https://github.com/oyvindln/vhs-decode
Thinking about this more, you probably want to develop a curve in your color space that represents something with constant CMYK values for your chosen light source.
https://python-colormath.readthedocs.io/en/latest/conversions.html
E.g. your sodium light is roughly 100% yellow, ~10% magenta. Any color that varies cyan from 0%-100% and black from 0%-100% should presumably not reflect any additional color information (since the source light doesn’t contain any cyan, and black just controls brightness).
I also think this means that as long as you hold yellow and magenta constant, you can vary cyan and black to get comparison colors that will look the same. If you vary cyan together with yellow or magenta at the same time, the effect probably won’t work.
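A minimal sketch of that idea, using only the standard CMYK-to-RGB formula (no colormath needed): hold yellow and magenta at the hypothetical sodium-lamp values mentioned above (~100% Y, ~10% M, both assumptions for illustration), sweep cyan and black, and emit the resulting RGB swatches to compare under the lamp.

```python
def cmyk_to_rgb(c, m, y, k):
    """Naive CMYK -> RGB conversion, all components as 0-1 floats."""
    r = (1 - c) * (1 - k)
    g = (1 - m) * (1 - k)
    b = (1 - y) * (1 - k)
    return r, g, b

# Hypothetical sodium-light values from the reasoning above:
# yellow and magenta held constant.
Y, M = 1.0, 0.10

# Sweep cyan and black in 25% steps; under a narrow-band lamp these
# should (per the argument above) differ only in brightness.
candidates = [cmyk_to_rgb(c / 4, M, Y, k / 4)
              for c in range(5) for k in range(5)]

# Print as hex so the swatches can be dropped into any image editor.
for r, g, b in candidates:
    print(f"#{int(r * 255):02x}{int(g * 255):02x}{int(b * 255):02x}")
```

This is only a sketch of the geometry of the argument, not a claim about actual sodium-lamp appearance; real results will depend on your display and the lamp's spectrum.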
This is tricky because you have multiple curves in the color space that are valid when just considering a single wavelength. The reality is, your lamp emits a spectrum of light (sharp, but still has a width). There’s also the variability in perception. But I’m not sure what the “bandwidth” of our eyes is and what color resolution humans are capable of detecting.
Low pressure sodium lamps have a pretty sharp spectrum: https://en.wikipedia.org/wiki/Sodium-vapor_lamp
Looking at the color spectrum, have you just tried colors in the green-to-blue-to-purple range? I don’t think you need a Python library for this; I think you need to experiment. There’s a lot of dependence on the reflectivity of the material you’re looking at, in addition to the color you see under sunlight or even broad-spectrum indoor light.
Try blue and green and see if both look the same under the lamp.
For software, if you’re used to big tech wages, you’re not taking less than $180k base. RSUs are probably somewhere around a $400k initial grant, with anywhere from $25k-$100k in yearly refreshers. US engineers (good ones) are the furthest thing from cheap.