This corresponds to new functionality in today's OS updates around dimming flashing lights:
https://9to5mac.com/2023/03/27/apple-releases-tvos-16-4-with...
https://developer.apple.com/documentation/mediaaccessibility...
For epileptics like myself, this is a huge deal. It helps me deal with a lot of media that otherwise might cause me to suffer a seizure.
Could flashing lights be the new blue light? Even for non epileptics, flashing lights might be associated with eye strain, sleep disruption, migraines, and stress. I may turn this feature on.
This. I get page-rage and will even navigate away from webpages or videos with too much flashing (whether by accident due to loading issues or by design).
I'm not epileptic.
I basically only encounter this when I use a machine without an ad blocker. (And holy ** is the Internet bad without one.)
I’m not epileptic but I find flashing lights extremely unpleasant and will always avoid or look away if I encounter them.
Like most accessibility features, you don’t need to have a condition to find value in this one.
Are there any comparable tools or filters you find useful now? I have an old epileptic cat on my couch, and I’m always worried about subjecting her to my horrible old anime. I was just yesterday thinking a filter like this was probably out there and, if not, might be easy to make. Looking for VLC plugins, though, I don’t see anything.
VLC:
Tools > Effects and Filters > Video Effects > Advanced > Anti-Flickering
Tools > Preferences > Show settings: All > Search: antiflicker
mpv --vf=deflicker,scale=960x540,pixscope 'https://github.com/apple/VideoFlashingReduction/blob/main/VideoFlashingReduction_Mathematica/Resources/movie.mp4?raw=true'
(generate test video): ffplay -f lavfi -i "color=777777[x],color=cccccc,
[x]framepack=frameseq,settb=AVTB,
setpts=N/TB/${flickerrate:-12},framerate=${interpolated:-24},
deflicker=size=5:mode=am,scale=640x480,pixscope"
antiflicker and deflicker are different implementations.

Doh! there's a better filter for this: photosensitivity. It detects local brightness changes that sum to 0, unlike deflicker and apparently the Apple algorithm.
drawgraph can be bodged to convert metadata to frames, and signalstats the opposite. This enables silly things like a self-contained filtergraph mimicking Apple's demo with plots and usable in mpv.
frames=5
sensitivity=1.5
dim_range=2
dim_intercept=-0.4
smoothing=10
smoothtype=am
decay=0.9
min_contrast=0.0
min_brightness=0.0
min_saturation=0.0
ffplay -vf "split[original], crop=640:360:0:0,
split=5[i1][i2][i3][i4][i5];
[i1]photosensitivity=$frames :$sensitivity,
drawgraph='bg=black@0 :r=60 :s=1x1 :m1=lavfi.photosensitivity.factor
:fg1=0x010101*round(0xff*clip(VAL*$dim_range+$dim_intercept,0,1))',
split[sensedat], deflicker=$smoothing :$smoothtype,
negate, lagfun=$decay, negate, split[mit1][mit2];
[mit1][i2]scale2ref[mask][i2]; [i2][mask]alphamerge[m];
[mit2]signalstats[mitdat];
[i3]eq=$min_contrast :$min_brightness :${min_saturation}[a];
[a][m]overlay, split[output1][output2];
[sensedat]drawgraph=lavfi.photosensitivity.badness :0xffff0000
:bg=white :r=60 :slide=scroll :size=512x256 :min=0 :max=7[bad];
[bad][i5]scale2ref[badout][reuse];
[badout]drawtext=fontcolor=2222dd:text='frame %{metadata\:lavfi.photosensitivity.frame-badness}'[badout];
[badout]drawtext=fontcolor=0000ff:y=1.2*lh:text='badness %{metadata\:lavfi.photosensitivity.badness}'[badout];
[badout]drawtext=fontcolor=777700:y=2.4*lh:text='factor %{metadata\:lavfi.photosensitivity.factor}'[badout];
[mitdat]drawgraph=lavfi.signalstats.YAVG :0xff00ccdd
:bg=00000000 :r=60 :slide=scroll :size=512x256 :min=0 :max=255, vflip[factor];
[factor][badout]scale2ref[factorout][badout];
[factorout]drawtext=fontcolor=999900:y=3.6*lh:text='smoothed %{metadata\:lavfi.signalstats.YAVG}'[factorout];
[badout][factorout]overlay[rplot];
[i4]signalstats, drawgraph=lavfi.signalstats.YAVG :0xffff0000
:bg=white :r=60 :slide=scroll :size=512x256 :min=0 :max=255[lum];
[lum][reuse]scale2ref[lumout][reuse];
[lumout]drawtext=fontcolor=ff0000:text='%{pts\:hms} %{pts} %{eif\:n\:d\:5}'[lumout];
[lumout]drawtext=fontcolor=0000ff:y=1.2*lh:text='avg luma %{metadata\:lavfi.signalstats.YAVG}'[lumout];
[output1]signalstats, drawgraph=lavfi.signalstats.YAVG :0xff00ccdd
:bg=00000000 :r=60 :slide=scroll :size=512x256 :min=0 :max=255[mlum];
[mlum][lumout]scale2ref[mlumout][lumout];
[mlumout]drawtext=fontcolor=999900:y=2.4*lh:text='mitigated %{metadata\:lavfi.signalstats.YAVG}'[mlumout];
[lumout][mlumout]overlay[lplot];
[reuse][output2][rplot][lplot]xstack=4 :0_0|w0+30_0|w0+30_h0+20|0_h0+20, [original]hstack" \
'https://web.archive.org/web/0id_/https://developer.apple.com/accessibility/downloads/video-flashing-reduction-fig12.mp4'
There’s this Chrome extension for YouTube videos. It stops the video before trigger scenes, so you might find it a bit disruptive depending on your (or your cat’s) sensitivity.
https://chrome.google.com/webstore/detail/seizafe-epilepsy-a...
To the best of my knowledge, there's nothing else that's realtime like this for end users?
There are other offline tests, like the Harding test, that are run by content creators (which is why you see epilepsy warnings at the start of shows).
FWIW, I'm not sure if this actually would benefit cats. I'm not up to speed on feline perception, but I know dogs see at a different frame rate than humans so things that won't trigger us might trigger them differently
This might be helpful: https://www.doesthedogdie.com/are-there-flashing-lights-or-i...
I was so happy and pleasantly surprised to see this feature come out today! I just spent the last week telling a client they can’t put their flashy new brand video on their website and still meet AA guidelines.
I told them to do a Harding Test and get the company who made it to do the edits because WCAG doesn’t leave any wiggle room when it comes to potential seizures (rightfully so).
Glad to see Apple leading in this front. These features aren’t sexy but they make a real difference in people’s lives!
Some more documentation on this API is here: https://developer.apple.com/accessibility/#whats-new
If anyone is curious how this actually works, I found the included Mathematica PDF to be the most helpful:
https://github.com/apple/VideoFlashingReduction/blob/main/Vi...
Also a higher-level overview: https://developer.apple.com/accessibility/downloads/Video-Fl...
I’m not sure what’s more impressive, the research or simply that Apple has public documentation on something.
Presumably this is the algorithm powering a new feature in tvOS 16.4:
> This update adds Dim Flashing Lights, an accessibility option to automatically dim the display of video when flashes of light or strobe effects are detected, and includes performance and stability improvements.
Next up: sound leveling?
Settings > Audio and Video > Reduce Loud Sounds
I wish it worked even better at reducing loud sounds and increasing the volume of dialogue. Background sounds and special-effects sounds are still so much louder than the barely audible dialogue.
I totally agree. I use ffmpeg's dynaudnorm[0] filter in mpv/IINA when watching media to better normalize sound volume. It's the best audio normalizer I’ve used: no loss of dynamic range within the target window duration (~15 seconds by default, though I tweak it to ~8 seconds).
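For intuition, here's a toy sketch in plain Python of the windowed-gain idea behind this kind of normalizer. To be clear, this is an illustration, not ffmpeg's actual dynaudnorm algorithm: each window of samples gets a gain pulling its peak toward a target, and the gains are smoothed across neighboring windows so the dynamics inside each window survive.

```python
# Toy windowed loudness normalization, in the spirit of (but much simpler
# than) ffmpeg's dynaudnorm: compute a per-window peak gain, then smooth
# the gains across neighboring windows so short-term dynamics are
# preserved rather than squashed.

def normalize(samples, window=8, target=0.9, smooth=1):
    windows = [samples[i:i + window] for i in range(0, len(samples), window)]
    # Gain that would bring each window's peak up (or down) to the target.
    gains = [target / max(abs(s) for s in w) if any(w) else 1.0
             for w in windows]
    # Moving average over neighboring windows to avoid abrupt gain jumps.
    smoothed = []
    for i in range(len(gains)):
        lo, hi = max(0, i - smooth), min(len(gains), i + smooth + 1)
        smoothed.append(sum(gains[lo:hi]) / (hi - lo))
    return [s * g for w, g in zip(windows, smoothed) for s in w]

quiet_dialogue = [0.05, -0.04, 0.06, -0.05] * 4   # barely audible speech
loud_effects = [0.9, -0.8, 0.95, -0.85] * 4       # explosions etc.
out = normalize(quiet_dialogue + loud_effects)
# The quiet windows get a large boost; the loud windows stay near unity gain.
```

The smoothing step is what distinguishes this family of filters from plain compression: a lone loud transient can't yank the gain around instantly.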
If you increase volume on the center channel while reducing overall volume the dialogue usually gets much clearer.
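As a sketch of that idea, here's a toy Python version on a hypothetical (left, right, center) sample representation (a real player would do this through its channel-mixer settings, not like this): raise the center channel, which usually carries dialogue, while pulling the sides down.

```python
# Toy center-channel boost: dialogue usually lives in the center channel,
# so raising it while lowering left/right makes speech clearer without
# raising the overall level.

def boost_center(frames, center_gain=2.0, other_gain=0.5):
    # frames: list of (left, right, center) sample tuples
    return [(l * other_gain, r * other_gain, c * center_gain)
            for l, r, c in frames]

mix = [(0.4, 0.4, 0.1)]                 # effects loud, dialogue quiet
louder_dialogue = boost_center(mix)     # center rises to 0.2, sides drop to 0.2
```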
Is there a tool that lets me analyze video and make this change, or does it generally require an audio interface?
I find I have this problem with Chromecast and YouTube. I think I would be pretty happy if I could set up yt-dlp to grab the video, adjust the audio if needed, and then play the video.
Bless you. I appreciate this, and my mother who has some neurological issues related to auditory processing definitely appreciates it.
R 128 (https://en.wikipedia.org/wiki/EBU_R_128) already exists for that.
This is nice.
It would be nice if Apple also addressed PWM and temporal dithering on the displays of their newer devices.
I see some pretty weird stuff on my M1 MacBook when toggling between extreme contrast ranges.
Some days I wish I lived on the other side of whatever bell curve they use to determine how obnoxious time-domain techniques are permitted to be for humans.
Me too, and by "weird stuff" I mean "nasty nasty headaches".
Yes! I downgraded to an iPhone 11 when I finally identified Apple’s use of PWM on their OLED iPhones as the cause of my eye strain. No issues since.
The only iPhone currently sold that is not OLED is the SE (I think). I don't know what the upgrade path is beyond that.
"not apple" seems like a pretty solid upgrade path if apple are actually (accidentally?) hurting you?
How the hell that company has apologists given how incredibly cynical and miserable they are and have always been I will never know.
"Wow, you can get away with that? How?!?" --Microsoft on seeing apple behavior.
There should be an option to control and limit the severity of flashing brought about by the flash-the-light-for-notifications accessibility setting. I like to use it because I don't want to make my phone obnoxiously loud and I can't hear its vibration, but I still want a chance to perceive notifications. Usually, though, the light is too damn bright, and I do worry it could trigger epilepsy in innocent passersby. Having an alternative flashing behavior, like a smooth pulsation, would be excellent.
Nice! I've wanted this for years. Most recently, this last weekend, while watching a '90s broadcast TV episode.
I'm not epileptic, but I don't like to take chances with things like that. And, maybe a few times a year, I'm concerned because I see someone doing rapid flashing lights, and it seems irresponsible.
In the last two years or so there have been a number of things I’ve watched (sorry, I don’t remember specifics) where I actively looked away because the flashing was just uncomfortable to watch. I explicitly remember thinking “I guess this proves I don’t have photosensitive epilepsy.”
There’s a good chance I will turn this on just for comfort.
Great example of how investing in accessibility for specific groups has off-target benefits for lots of other people!
The curb-cut effect! https://en.m.wikipedia.org/wiki/Curb_cut_effect
The '70s movie Dark Star even comes with a warning about flashing lights at the beginning, so it would be a great test case for this feature.
And a great fun movie, so it's all gravy.
Yea I have the same problem with lots of flashing.. it just tires my eyes so much. I even struggle sometimes with walking past emergency vehicle flashing lights at night and need to look away because jeez my eyes.
So far so good on the epilepsy front tho. Thankfully.
I feel the exact same way; I'm not epileptic but I avoid media that knowingly contains rapid flashing lights out of precaution, and I look away or scroll down whenever I do come across rapid flashing lights.
I'm glad that Apple has made flashing-light detection and auto-dimming technologies more available. It's amazing, in a sad way, how many people are still unaware of the dangers of certain flashing light patterns, which leads to incidents continuing to occur.

I personally only know about such dangers because I've been a long-time Pokémon fan since the English dub started airing in 1998; to this day the only episode of the Indigo League era I haven't watched is the infamous Porygon episode (aired only once in Japan), whose flashy battle scene sent about 700 people to the hospital. Yet even with this high-profile incident, I've heard some people call it an urban legend until I told them otherwise, and there have been incidents in other media. There needs to be more awareness of the dangers of flashing lights, so that media creators and broadcasters can avoid them or mitigate their risks.
If you're not epileptic, then what exactly is the precaution? That you might actually be epileptic?
I don’t think that episode was ever considered an urban legend. It’s well documented.
https://en.wikipedia.org/wiki/Denn%C5%8D_Senshi_Porygon?wpro...
However, the thing people do talk about is how severe it was, and a small percentage of the reactions may have been the result of hysteria.
> I'm not epileptic, but I don't like to take chances with things like that.
What are the chances you don't like taking, chance of what?
This has been a thing for a while in the broadcast world. There was a piece of gear that could generate a report showing that the content did not have strobing that could cause issues. It's been a long time since I've been involved, so I don't know the current specs, but I know platforms like iTunes, Netflix, et al. wanted this as well. Disney was a stickler for it, since so much of its content was animation and kid-oriented, though I might be confusing my time with their broadcast tapes vs. streaming masters. As far as I'm aware, if the content did not pass this test, it was sent back for editing rather than just adjusting some settings on playback. So it's interesting that this is a realtime levels adjustment.
There’s an online test called the Harding Test that’s the standard now. It’s required for broadcast media in the UK.
Hmm, everything is based on the average normalised luminance of each frame. So if half the frame is flickering in inverse phase, it says there is no risk? That seems unlikely to me.
Nevertheless a useful piece of code - glad to see it.
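To make that concern concrete, here's a toy sketch in plain Python (not Apple's code, and assuming the parent's reading of the algorithm is right): two halves of a frame flash in exact antiphase, so each half swings between full black and full white while the frame-average luminance never moves.

```python
# Two halves of a frame flicker in exact antiphase: when the left half is
# bright, the right half is dark, and vice versa. Each half swings through
# the full luminance range every frame, yet the mean luminance of the whole
# frame is constant -- so a detector that only looks at per-frame averages
# would see nothing.

def frame(t):
    left = 1.0 if t % 2 == 0 else 0.0   # left half: flashes on even frames
    right = 1.0 - left                   # right half: exact inverse
    return left, right

means = []
local_swings = []
for t in range(10):
    left, right = frame(t)
    means.append((left + right) / 2)          # frame-average luminance
    local_swings.append(abs(left - right))    # local contrast within the frame

# means == [0.5] * 10: the average never moves, even though each half
# alternates between full white and full black on every frame.
```

This is why the photosensitivity ffmpeg filter mentioned upthread looks at localized brightness changes rather than only the global mean.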
Not entirely relevant, but just as a point of interest, broadcast television in the UK has required this at least since the 80s. Most TV produced in the UK avoids it happening in the first place, but they have a filter in the broadcast chain that cranks down the contrast whenever bright flashing is detected. It's particularly noticeable when they show American movies - gunfire flashes are muted to an odd grey colour.
I wonder if this applies to all videos / video players on macOS, or just when using the Apple "TV" app, Safari, etc.
Maybe Tesla could make use of this..
I don't think it works on stopping the blue and red flashing lights that come up behind you from using launch control
I meant the crashing into fire trucks and other emergency vehicles thing
Mine sounds more useful /s
They should introduce this in their e-paper tablets. Page turns often cause a lot of flashes.
Apple has e-ink tablets? If you're thinking of e-readers like Kindle, a) most devices released in the past ten years have muted page turns with little or no flashing, b) even on very old devices, a page turn is a single gray-black-gray transition occupying a small part of the visual field, which is very unlikely to trigger a photosensitive seizure.
I don't need this but I think it is awesome if it helps people!
Immediate thought: knowing the algorithm, make a flashing light video that it wouldn't detect.
Also reminds me of an epilepsy support forum that got hacked and defaced with flashing animated gifs back in 00s.
With implementations in... Swift, Mathematica and MATLAB. I would almost be inclined to complain if I didn't live in an age where I could paste the abstract into ChatGPT and get a version in my conlang of choice.
And without ChatGPT, what would you be complaining about? Just whining to whine when they provided 3 implementations and not 4?
You're right, I should be thankful they didn't use COBOL for all three.
Again, I'll ask, what's the complaint (in a ChatGPT-free world to go with your first comment) if someone provides implementations in only a handful of languages? What is your actual complaint or do you just enjoy making non-serious whining posts?
Nothing, I guess. I was only "almost" inclined to complain anyways, so I don't get why you see the reason to persecute me. The minor quirk that made me mad is that a FAANG company wrote something 3 times over but didn't once manage to choose a language people use. It's like if Microsoft open-sourced their new autocorrect library for Visual Basic, C#, and Excel - it's readily apparent why those languages would be chosen.
So, nothing is wrong with it. I'm pissing and moaning about getting trolled by the big company again, because in this instance I think it's more intellectually stimulating than giving $FAANG_CO a consolation prize.
Apple implemented it in Swift, because they needed it for a feature in iOS. They also have Matlab and Mathematica versions, because mathy types like those languages.
without a doubt the strangest take. they released open source implementations of an a11y feature, and did so in languages designed for mathematical computation, and you’re mad because FAANG+ didn’t use rust?
wut
Outside of your deliberate misinterpretation of my point, yes. Apple could have written it in 10 languages and they still wouldn't choose something open or widely used.
It's one thing to write it in Swift, but then rewriting it in Matlab and Mathematica before generously releasing it to the public is just a slap in the face. Maybe that's lost on the denizens of today's Hacker News, but it bothers me anyways.
Well of course it does. It's Apple.
It's not "trolling" to provide an open-source implementation of something in languages that you personally don't use.
No, but it is trolling to "generously" port that code to two proprietary languages before releasing it. Were it not for the cookie points they earn for opening accessibility code, they may as well have never released this.
> No, but it is trolling to "generously" port that code to two proprietary languages before releasing it
No, it's not.
Okay, I guess.