ramielrowe

joined 1 year ago
[–] ramielrowe 2 points 3 months ago (7 children)

GitHub and GitLab are free, and both even allow private repos for free at this point. Git is practically one of the first tools I install on a dev machine. Likewise, git is the de facto means of package management in golang. It's so built in that module names are repo URLs.
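
For example, a go.mod makes the repo-URL naming visible directly (the module path here is hypothetical; the dependency is just an illustration):

```
module github.com/example/mytool

go 1.21

require github.com/google/uuid v1.6.0
```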

[–] ramielrowe 5 points 3 months ago (1 children)

Git was literally written by Linus to manage the kernel's source. Sure, patches are proposed via mailing list, but the actual source is hosted and managed in git. It is the gold standard, and source control is a foundational piece of software development. The same goes for testing, and not just unit tests but functional tests too. You absolutely should not be putting off testing.

[–] ramielrowe 12 points 3 months ago (3 children)

Gotta be honest, downloading security-related software from a random drive gives off sketchy vibes. Fundamentally, it's no different than a random untrusted git repo. But I really would suggest using some actual source control rather than trying to roll your own with diff archives.
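
Getting the existing tree under git is only a handful of commands (the remote URL below is a placeholder):

```
git init -b main
git add .
git commit -m "import current release"
git tag v1.0.0
git remote add origin https://github.com/you/yourtool.git
git push -u origin main --tags
```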

Likewise, I would also suggest adding some unit and functional tests. Not only would it help maintain software quality, it would also build confidence among the folks using the software you're releasing.
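
If the codebase were Go (just as an illustration, since I don't know what it's written in), a minimal table-driven unit test looks like this; the `reverse` function is only a stand-in for whatever your tool actually does:

```go
package example

import "testing"

// reverse is a stand-in for whatever function the project actually exposes.
func reverse(s string) string {
	r := []rune(s)
	for i, j := 0, len(r)-1; i < j; i, j = i+1, j-1 {
		r[i], r[j] = r[j], r[i]
	}
	return string(r)
}

// TestReverse is a minimal table-driven unit test.
func TestReverse(t *testing.T) {
	cases := []struct{ in, want string }{
		{"", ""},
		{"abc", "cba"},
		{"racecar", "racecar"},
	}
	for _, c := range cases {
		if got := reverse(c.in); got != c.want {
			t.Errorf("reverse(%q) = %q, want %q", c.in, got, c.want)
		}
	}
}
```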

[–] ramielrowe 41 points 5 months ago (4 children)

After briefly reading about systemd's tmpfiles.d, I have to ask why it was used to create home directories in the first place. The documentation I read says it is for volatile files. Is a user's home directory considered volatile? Was this something the user set up, or the distro they were using? If it was the distro, this seems like a lot of ire directed at someone who really doesn't deserve it.
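
For reference, tmpfiles.d entries are single lines like the one below (path and user are hypothetical). The Age column is what makes an entry "volatile": if it's set, systemd-tmpfiles --clean is allowed to remove contents older than that age, which is exactly why pointing one of these at a home directory seems like a bad idea.

```
# /etc/tmpfiles.d/example.conf
# Type  Path            Mode  User      Group     Age  Argument
d       /home/someuser  0700  someuser  someuser  -    -
```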

[–] ramielrowe 3 points 5 months ago (1 children)

I have a similar issue when I'm visiting my parents. Despite having 30 Mbps upload at home, I can't get anywhere near that when accessing things from my parents' house. It's not just Plex either; I host a number of services. I've tested their Wi-Fi and download speed, and everything seems fine. I can also stream my Plex just fine from my friends' places. I've chalked it up to poor (or throttled) peering between my parents' ISP and mine. I've been meaning to test it through a VPN next time I go home.

[–] ramielrowe 1 points 6 months ago* (last edited 6 months ago)

Here's a drawing of what I think might be happening to your private traffic: traffic diagram

One major benefit of this approach is that CloudFlare does not need to revoke an entire public certificate authority (CA) if a single private tunnel's CA is compromised.

[–] ramielrowe 3 points 6 months ago* (last edited 6 months ago) (2 children)

I somewhat wonder if CloudFlare is issuing two different certs: an "internal" cert your servers present to CloudFlare, signed by a private CA that is only valid for CloudFlare's internal services. CloudFlare's tunnel service validates against that internal CA, and then serves public internet traffic using a cert signed by an actual public CA.

Honestly though, I kinda think you should just go with serving everything entirely externally. Either you trust CloudFlare's tunnels, or you don't. If you don't trust CloudFlare to protect your services, you shouldn't be using it at all.

[–] ramielrowe 10 points 6 months ago* (last edited 6 months ago) (10 children)

Just serve the CloudFlare certs. If the URL is the same, it won't matter. It doesn't matter whether you're talking to a local private address like 192.168.1.100 or a public IP. If you're accessing it via a DNS name, that is what is validated, not the underlying IP.
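
A quick way to see this (the hostname and LAN IP below are placeholders): dial the private address directly but verify against the DNS name, and a public cert for that name validates fine.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// Connect to the LAN address, but verify the certificate against the
	// public hostname. TLS validates the name, not the IP you dialed.
	conn, err := tls.Dial("tcp", "192.168.1.100:443", &tls.Config{
		ServerName: "app.example.com",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	fmt.Println("cert verified for:", conn.ConnectionState().PeerCertificates[0].Subject.CommonName)
}
```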

PS. If you tried this and are having issues, we need more details about how things are set up and how you are accessing them.

[–] ramielrowe 4 points 8 months ago

The entire goal of Server Meshing is fixing the "existing 100-player ones barely function" issue. Server Meshing breaks a single logical "server" into multiple backend servers that are meshed together to provide a consistent and transparent experience. A 400-player server could actually end up being 10 backend servers with 40 players each. Thus, it'll run much better.

[–] ramielrowe 0 points 8 months ago* (last edited 8 months ago)

I'm not saying they were purposefully cheating in this or any tournament, and I agree cheating in that context would be totally obvious. But it is plausible that a pro worried about their stats might be willing to cheat in lower-stakes situations outside of tournaments.

What I also don't understand is: if this hacker has lobby-wide access, why were only these two people compromised? Why wouldn't the hacker just hit the entire lobby? Clearly this hacker loves the clout, and forcing cheats on the entire lobby would certainly be more impressive.

PS. This is all blatant speculation, from all sides. No one other than the hacker, and hopefully Apex, really knows what happened. I am mostly frustrated by ACPD's immediate fear-mongering about an RCE in EAC or Apex based on no concrete evidence.

[–] ramielrowe 22 points 8 months ago* (last edited 8 months ago) (1 children)

This isn't a statement from Apex or EAC. The original source for the RCE claim is the "Anti-Cheat Police Department," which appears to just be a Twitter community. There is absolutely no way Apex would turn over network traffic logs to a Twitter community; who knows what kind of sensitive information could be in them. At best, ACPD is taking the players at their word that the cheats magically showed up on their computers.

PS. Apparently there have been multiple RCE vulnerabilities in the Source Engine over the years. So, I’m keeping my mind open.
