ocassionallyaduck

joined 2 years ago
[–] ocassionallyaduck 1 points 1 day ago

Dunno then, my friend. It's not been an issue for me on either OS. But I believe you, of course. Good luck figuring it out.

[–] ocassionallyaduck 3 points 2 days ago (2 children)

You might want to check if Windows is the culprit.

[–] ocassionallyaduck 13 points 3 days ago

Or your "time clock earth sounds" app from the not so well policed appstore takes silent background screenshots, grayscales them and sends them to their host for OCR.

I agree this permission is annoying. But I differ in that I feel it should be system-controlled and invokable by apps that identify specific fields to be blocked, instead of just disabling it outright.

[–] ocassionallyaduck 4 points 1 week ago (4 children)

If this is the case for you (I have both in my house), I recommend putting your Roku TV behind a Pi-hole DNS server. It will block the TV's ad requests at the DNS level while letting content and video go through.
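If you want to go beyond the stock blocklists, you can pin specific domains. A minimal sketch, assuming Pi-hole v5's CLI; the hostnames below are placeholders, since the actual telemetry domains vary by TV model and firmware, so check your Pi-hole query log to see what your set really calls home to:

```
# Pi-hole v5 CLI. Example/placeholder domains - watch your own query
# log to see what your Roku actually resolves before blocking.
pihole -b ads.roku.com logs.roku.com

# Or catch a whole subdomain family with a regex blacklist entry.
pihole --regex '(\.|^)logs\.roku\.com$'
```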

[–] ocassionallyaduck 1 points 1 week ago (1 children)

Your understanding of frame generation is incorrect.

Again, for example's sake, let's assume an absurdly low FPS with a big frame window: 10ms between frames.

If your frame window is 10ms, Frame 1 renders at 0ms and Frame 2 at 10ms. Frame generation is not just interpolation; interpolation is what your new TV does when you activate motion smoothing and soap-opera mode. That is not what framegen is, at all.

In frame generation, the frame generation engine (driver or program) stores a motion vector array, which captures the trend of how pixels are likely to change. In our example, the motion vectors for the ball indicate large diagonal motion, say, while the overall frame indicates little or no motion because the user isn't swinging the camera wildly. The engine then uses Frame 1 to estimate a Frame 1.5, and the ball actually moves in the image thanks to that motion vector analysis. The ball moves independently of the scene itself, regardless of any change in the user's camera, so the user can see the ball itself moving against the background.

So, in Frame 1.5, the ball you are seeing, as well as the scene, has actually moved. Now the user can see this motion, and let's say they didn't notice it in Frame 1. This means Frame 1.5 is a chance for them to react! Their inputs go through sooner, reducing true latency by allowing them to react to in-game stimuli faster. Yes, even if the frame is "faked".

In reprojection, at Frame 1.5RP, crucially there is no new scene data. Reprojection doesn't use motion vectors; it uses the camera and geometry only. If the user isn't moving the POV at all, for example, the reprojection just puts the frame where it already was, and the user waits the full 10ms before the ball appears to move. Even when the camera is moving, reprojection only adjusts the scene's angle relative to the camera; the ball does not move within the overall scene. Again, consider the ball flying left while the user walks left. The reprojection cannot move the ball left. If anything, when the reprojection is applied to the existing scene geometry, the opposite occurs: the ball may even appear to move right or slow down due to parallax.

Reprojection uses old frame data and moves it like flat cards in 3D space, so the ball stays in position within the scene until Frame 2. It can only be affected by the camera motion that drives the reprojection, not by other rendering data, and what the user sees of the ball doesn't change until 10ms later. Only the overall flat scene gets reprojected, so tilting or swinging the camera can feel instantly responsive. But until the next render pass, the real motion data, delivered either via motion vectors or Frame 2, never reaches them in a reprojected Frame 1.5RP.
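To make the contrast concrete, here's a minimal toy sketch (plain Python, not any vendor's actual pipeline) of the two ideas: framegen extrapolates object positions from motion vectors, while reprojection only re-aims the old frame based on camera motion:

```python
# Toy sketch of the two techniques. Not DLSS/FSR/ASW internals,
# just the core idea in 2D: a ball at pixel (x, y) moving diagonally.

def framegen_estimate(pos, motion_vector, alpha=0.5):
    """Extrapolate an object's position part-way to the next real
    frame using its motion vector (what framegen relies on)."""
    x, y = pos
    dx, dy = motion_vector
    return (x + alpha * dx, y + alpha * dy)

def reproject(pos, camera_delta):
    """Shift the *whole old frame* by camera motion only.
    No new scene data: objects keep their old in-scene positions."""
    x, y = pos
    cam_dx, cam_dy = camera_delta
    return (x - cam_dx, y - cam_dy)

ball_f1 = (100.0, 100.0)      # ball in Frame 1
ball_motion = (40.0, -20.0)   # motion vector toward Frame 2
camera_still = (0.0, 0.0)     # user isn't moving the camera

# Frame 1.5 via framegen: the ball visibly advances.
print(framegen_estimate(ball_f1, ball_motion))   # (120.0, 90.0)

# Frame 1.5RP via reprojection with a still camera: nothing moves.
print(reproject(ball_f1, camera_still))          # (100.0, 100.0)
```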

So again, your understanding of current framegen is wildly incorrect. What you are describing as reprojection getting better is essentially adding reprojection to framegen: use motion vectors to render the new portion of the frame, and use the reprojection to adjust the overall POV based on camera input. Which, again, works well. Adding reprojection and framegen together is not a bad idea, and reprojection is great for reducing perceived latency (which is why it's essential for avoiding motion sickness in VR). These are two techniques solving different forms of latency issues; combined, they offer far more.

[–] ocassionallyaduck 2 points 1 week ago

No problem. Check out the selfhosted subs here and on reddit for some advice if you want to go that direction.

You don't even need much space or power to start off if you're limiting it to photos and Docker services mostly. The vast majority of my data use is my entire family's collective media vault. A Raspberry Pi 4 with an external 2TB drive is great for lightweight services and networking.

[–] ocassionallyaduck 3 points 1 week ago (2 children)

I host a bunch of services now, but Plex, Syncthing and the *arr apps have been my standouts for a while.

Syncthing is a NAT-hopping file sync tool that uses relays to establish communication across firewalls (Windows/Linux). Think of the relays like an FPS matchmaking server for setting up your file transfer. It's secure and quite nice. I use it to keep my photos backed up no matter what kind of connection I have, and to sync things like game saves and save states between multiple devices/locations.

Syncthing was also great because at the time I didn't have a NAS or anything like immich running for a more complex solution. Syncthing is just folder syncing. So I synced /photos on my phone and C:\photos on my desktop. The first sync dumps the entire camera folder onto my desktop via Syncthing (and anything on the desktop onto my phone, if the folder wasn't empty). Then on the desktop side, when the phone starts to fill up, I just move the files from C:\photos into C:\photoarchive, for example. Syncthing sees that the files are gone and tells my phone to remove them to stay in sync, so it does, and I get back tons of space.

If nothing else, I recommend at least setting up Syncthing backup for preserving your data and not paying Google a dime. Then if you go down the self-hosted rabbit hole a bit further, learn how to manage Docker and set up an immich instance. At that point you can either use a VPN and access it that way, making it "local", or register a domain and point it at your machine and all that. If you've never self-hosted a website or page before, this can be challenging; the VPN solution is quite simple, and more secure.
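For the Docker step, the general shape looks like the sketch below. This is not immich's official compose file (grab that from their docs; it changes between releases), so treat the images, ports, and env vars here as illustrative assumptions:

```yaml
# Rough sketch only - get the real docker-compose.yml and .env from
# the immich docs. Images, ports, and required env vars change between
# releases, and immich needs a Postgres build with a vector extension.
services:
  immich-server:
    image: ghcr.io/immich-app/immich-server:release
    env_file: .env
    volumes:
      - ./library:/usr/src/app/upload   # where your originals live
    ports:
      - "2283:2283"                     # web UI / API
    depends_on:
      - redis
      - database
  redis:
    image: redis:alpine
  database:
    image: postgres:16                  # see immich docs for the exact image
    environment:
      POSTGRES_PASSWORD: changeme       # set a real secret in .env
      POSTGRES_USER: postgres
      POSTGRES_DB: immich
    volumes:
      - ./pgdata:/var/lib/postgresql/data
```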

All that said, Syncthing is not intuitive at first, in my opinion. If you make a synced folder entry in the app, the name of that entry doesn't have to match the folder name, or match between devices. At all.

So on your desktop you might name this entry "receive photos" and link it to C:\photos, while on your phone it's called "sync photos" and linked to /emulated/0/DCIM/Camera. See what I mean? They look totally unrelated. But if you look closely, they both share a matching ID from when the folder was first shared.
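You can see this in each device's config.xml: the label is cosmetic, the id is what has to match. A rough sketch with a made-up ID:

```xml
<!-- Desktop config.xml: the label is cosmetic, the id is what matters -->
<folder id="abcd1-efgh2" label="receive photos" path="C:\photos" type="sendreceive">
    <!-- shared-device entries etc. elided -->
</folder>

<!-- Phone config.xml: different label, different path, same id -->
<folder id="abcd1-efgh2" label="sync photos" path="/emulated/0/DCIM/Camera" type="sendreceive">
    <!-- shared-device entries etc. elided -->
</folder>
```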

[–] ocassionallyaduck 1 points 1 week ago (3 children)

Frame reprojection lacks motion data. It's right there in the name: it is re-projecting the last frame. Frame generation uses the interval between real frames, feeds in vector data, and estimates movement.

If I am trying to follow a ball going across the screen, not moving my mouse, reprojection is flat-out worse, because it is reprojecting the last frame, where nothing moved. Frame 1, Frame 1RP, then Frame 2: Frames 1 and 1RP have the ball in the exact same place. If I move my viewpoint, the perspective will feel correct, the viewport edges will blur, and the reprojection will map to the new perspective, which feels better for head tracking in VR. But for information delivery there is no new data, not even a guess. It's still the same frame, just at a different point in space, until the next real frame comes in.

With frame generation, if I am watching this ball again, it looks more like Frame 1 (real), Frame 1G (estimate), Frame 2 (real). Now Frame 1 and Frame 1G contain different data, and 1G is built on the vector data between frames. It's not 100% accurate, but it's an educated guess at where the ball is going between Frame 1 and Frame 2. If I move my viewpoint, it doesn't feel as responsive as reprojection, but the gained "fake" middle frame helps with motion tracking in action.
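Toy numbers for that timeline, assuming the ball moves 40px per real frame and the mouse is held still:

```python
# Toy timeline, mouse held still; ball moves 40 px per real frame.
frames = {
    "Frame 1   (real, t=0ms)": 100,            # ball at x = 100
    "Frame 1RP (t=5ms)":       100,            # reprojection: no scene data, ball unmoved
    "Frame 1G  (t=5ms)":       100 + 40 * 0.5, # framegen: motion vector pushes ball to 120
    "Frame 2   (real, t=10ms)": 140,           # ball really at x = 140
}
for name, x in frames.items():
    print(f"{name}: ball at x={x}")
```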

The real answer is to use frame generation with low-latency configurations, and also enable reprojection in the game engine if possible. Then you have the best of both worlds. For VR, the headset is the viewport, so it's handled at the driver level. But for games, the viewport is a detached virtual camera, so the gamedev has to expose this and set up reprojection, or Nvidia and AMD need to build some kind of DLSS/FSR-like hook for devs to utilize.

But if you could do both at once, that would be very cool. You would get the most responsive feel in terms of lag between input and on-screen action, while also getting motion updates faster than a full render pass. So yes, Intel's solution is a step in that direction. But ASW is not in itself a solution, especially for high-motion scenes with lots of graphics. There is a reason the demo engine in the LTT video was extremely basic: if you overloaded it with particle effects and heavy rendering like you see in high-end titles, the smearing from reprojection would look awful without rules and bounding on it.

[–] ocassionallyaduck 7 points 1 week ago (4 children)

Gmail, Drive, and Photos used to be three services that aggressively pushed you to put all your data into them, gave you 10GB each, and in the case of Photos offered a nice, fully featured viewer.

Then, they decided to consolidate account storage. Now these services all share a single 10GB pool, and every high-quality photo or heavy email from your 10-year-old inbox is adding up.

And then, having enshittified it, they start selling you Google backup.

This isn't working

Google One. Still not that popular...

Okay, now let's have constant, persistent nags across all three services warning you that you're running out, starting at like 60 or 70 percent full.

I was done when they almost got my girlfriend at the time, as well as my parents, with this dark-pattern bullshit. I backed up their data immediately and cleared their devices, set up Syncthing, and started working on hosting an alternative. I hadn't even learned about immich yet.

[–] ocassionallyaduck 5 points 1 week ago

TBH, on ancient insecure systems that might work.

Not very exciting though

[–] ocassionallyaduck 5 points 1 week ago

It's also way better for you.

Legit, I was warned so much about eating disorders when I was young that I never learned to just eat light, or that fasting is a thing.

Eat some nuts and enjoy some other stuff. Meat should be good cuts, and only 2-3 times a week.

[–] ocassionallyaduck 2 points 1 week ago

Immich, Obsidian, Ollama, Plex...

Barely scratching the surface but yea, I host everything.
