Emotet

joined 7 months ago
[–] [email protected] 78 points 1 month ago (4 children)

I simply can't wrap my head around the thought process behind launching a clusterfuck like this. Y Combinator probably didn't do their due diligence and simply rode the fading AI Bubble, so I can at least understand how the funding might have been approved.

But actively leaving your $250,000+/year job to team up with some questionable characters, basically fork two open-source projects, swap out the Discord links and slap an illegal license on that shit show is beyond me. And when made aware of it, they proudly state, publicly: "dawg i chatgpt'd the license, anyone is free to use our app for free for whatever they want. if there's a problem with the license just lmk i'll change it. we busy building rn can't be bothered with legal"

This is absolutely insane. It sounds like someone was about to get fired and decided to use some personal connections and fresh graduates to cash in one last time, with absolutely no regard for even the basics. Pretty wild that those guys even managed to figure out how to found a startup. Probably asked ChatGPT for instructions there as well.

[–] [email protected] 26 points 1 month ago (2 children)

Yeah, that's one of those tropes I hate pretty much everywhere, but (old) Star Trek is great enough to look past it.

They are skilled and professional. But how incompetently must the playbook have been written if pretty much anyone can spontaneously come up with something nobody had derived before, and that easily?

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Great points.

Regular solar cells with better efficiency are already a thing, even in a compact travel format or as a novelty feature on some electric cars. Those are cheap to produce but still aren't practical at all, unless we're talking about something like a 2 m² solar panel charging a phone in a somewhat reasonable time on a very sunny day in an off-grid situation.

Using transparent solar cells in buildings in place of windows, in addition to regular panels, is pretty much the only reasonable application I can think of right now, but with a visible transmittance of 20% that's kinda far-fetched as well.

[–] [email protected] 29 points 1 month ago (11 children)

When used correctly, that simply isn't true, and it helps nobody to trumpet false information around. As the article itself explains, the timing attack happened in the usual way, though in a manner that is highly controversial, especially for the German legal system:

For the final identification, the district court of Frankfurt am Main ultimately obliged the provider Telefónica to determine which of all its o2 customers had connected to one of the identified Tor nodes.

In a timing attack, as the name suggests, access times and as much (meta)data about specific packets as possible are statistically correlated. Given enough data, this makes it possible to determine who is communicating with whom, even without direct access to the data itself.

In this case, every o2 customer in Germany was simply put under blanket surveillance to check whether they were contacting a particular server. To counter that, you can of course route your traffic through a (no-log) VPN provider in the first place, so you can't be matched at all.
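Purely to illustrate the principle (completely made-up file names and data, nothing to do with the actual investigation): once you have a log of connection times to the identified Tor node and a provider-side log of customer sessions, the "attack" boils down to matching timestamps within a tolerance and seeing which customer keeps lining up.

#!/bin/bash
# Hypothetical sketch of the timestamp matching behind a timing attack.
# tor_node_hits.log:     one epoch timestamp per observed connection to the node
# customer_sessions.log: "<epoch> <customer_id>" per provider-side connection
# A customer whose sessions repeatedly line up with the node hits stands out.

TOLERANCE=5   # seconds

awk -v tol="$TOLERANCE" '
  NR == FNR { hits[$1]; next }               # first file: node hit timestamps
  {
    for (t in hits)                          # second file: customer sessions
      if ($1 >= t - tol && $1 <= t + tol)    # session close to a node hit?
        count[$2]++
  }
  END {
    for (id in count) print count[id], id    # matches per customer
  }
' tor_node_hits.log customer_sessions.log | sort -rn | head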

[–] [email protected] 16 points 2 months ago* (last edited 2 months ago)

Yup. A variation of the quote (basically capitalists instead of American businessmen) is commonly attributed to Lenin instead of Khrushchev. But that, too, can't be verified and is said to be fake.

[–] [email protected] 8 points 2 months ago (1 children)

Gotta be honest, that's a pretty shitty article that suggests that Tesla built a train. They did not.

Tesla funded an employee shuttle to one of their factories by leasing a standard Siemens Mireo Plus B.

[–] [email protected] 3 points 2 months ago (3 children)

Buying a domain. There might be some free services that, like DuckDNS in its early days, work reliably for now. But IMHO they are not worth the potential headaches.

[–] [email protected] 2 points 2 months ago (6 children)

DuckDNS pretty often has problems and fails to propagate properly. It's not very good, especially with frequent IP changes.

[–] [email protected] 20 points 2 months ago (1 children)

Random guy from another instance here, with no affiliation to crypto and only a vague understanding of Monero, who saw the post on /all.

Most people stumbling across posts like this probably see yet another shady cryptocurrency and either aren't interested or actively dislike it, resulting in downvotes. Calling people "grudgeful bitfags" and "overly-sensitive leftist fediverse dwellers" probably doesn't help all that much either, and neither do comments that attribute the general disinterest to a "very successful psyop by the CIA to make crypto look like a scam".

[–] [email protected] 2 points 2 months ago

Damn, that's wild. Cheers for sharing!

 

I'm strongly considering adding another backup location in the form of an old Raspberry Pi and a USB HDD.

I want the Pi to use its available network exclusively to connect to my WireGuard server, so other devices (both local to the WireGuard server and remotely connected to it) can use it as a secondary backup location.

I'm kind of worried about a scenario where my network is compromised and, through the Pi's VPN connection, the external network it sits in gets compromised as well.

What are the best practices to secure such a setup?
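The rough direction I'm currently considering is to pin the Pi down on both ends: on the WireGuard server, route only the Pi's tunnel address and only towards the backup port, and on the Pi, drop everything that doesn't come in over the tunnel. A sketch of that idea (the addresses, interface names, key and port below are placeholders, and it assumes an existing nftables inet filter table with input/forward chains):

#!/bin/bash
# Placeholder sketch: restrict the backup Pi's WireGuard peer to the backup service only.
# 10.8.0.20, wg0, port 22 and the key value are made-up values for illustration.

PI_PUBKEY="placeholder-public-key-of-the-pi"

# On the WireGuard server: give the Pi a single /32 so nothing else is routed to it.
wg set wg0 peer "$PI_PUBKEY" allowed-ips 10.8.0.20/32

# Still on the server: only forward tunnel traffic to the Pi's backup port, drop the rest.
nft add rule inet filter forward iifname "wg0" ip daddr 10.8.0.20 tcp dport 22 accept
nft add rule inet filter forward ip daddr 10.8.0.20 drop

# On the Pi: accept the backup port over the tunnel only, never forward into its local LAN.
nft add rule inet filter input iifname "lo" accept
nft add rule inet filter input ct state established,related accept
nft add rule inet filter input iifname "wg0" tcp dport 22 accept
nft add rule inet filter input drop
nft add rule inet filter forward drop

On top of that, I'd probably keep the backups pull-based or append-only on the Pi's side, so a compromised source can't simply wipe them. But I'm open to better approaches.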

 

Currently, I have two VPN clients on most of my devices:

  • One for connecting to a LAN
  • One commercial VPN for privacy reasons

I usually stay connected to the commercial VPN on all my devices, unless I need to access something on that LAN.

This setup has a few drawbacks:

  • Most commercial VPN providers limit the number of simultaneously connected clients
  • I can either obfuscate my IP or access resources on that LAN (including my Pi-hole for custom DNS-based blocking), but not both at once

One possible solution would be to route all internet traffic through a VPN client on the LAN's router and figure out how to still keep at least one port open for the VPN Docker container that allows access to the LAN. But then split tunneling around that would be pretty hard to achieve.

I want to be able to connect to a VPN host container on the LAN, which in turn routes all internet traffic through another VPN client container while allowing LAN traffic, but still be able to split tunnel specific applications on my Android/Linux/iOS devices.

Basically this:

   +---------------------+ internet traffic   +--------------------+           
   |                     | remote LAN traffic |                    |           
   | Client              |------------------->|VPN Host Container  |           
   | (Android/iOS/Linux) |                    |in remote LAN       |           
   |                     |                    |                    |           
   +---------------------+                    +--------------------+           
                      |                         |     |                        
                      |       remote LAN traffic|     | internet traffic       
split tunneled traffic|                 |--------     |                        
                      |                 |             v                        
                      v                 |         +---------------------------+
  +---------------------+               v         |                           |
  | regular LAN or      |     +-----------+       | VPN Client Container      |
  | internet connection |     |remote LAN |       | connects to commercial VPN|
  +---------------------+     +-----------+       |                           |
                                                  |                           |
                                                  +---------------------------+

Any recommendations on how to achieve this, especially considering client apps for Android and iOS with the ability to split tunnel per application?

Update:

~~Got it by following this guide.~~

Ended up modifying this setup to have better control over potential IP leakage.
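For anyone finding this later, the general container layering I was after looks roughly like the sketch below. It assumes gluetun as the commercial VPN client and a WireGuard server container pushed through its network namespace; the image names, port and environment values are illustrative placeholders, not my actual configuration.

#!/bin/bash
# Sketch only: a LAN-facing WireGuard "host" container whose upstream traffic
# leaves through a commercial VPN client container. All values are placeholders.

# 1. Commercial VPN client (gluetun). The WireGuard port is published here,
#    because the second container will share this network namespace.
docker run -d --name vpn-client \
  --cap-add NET_ADMIN \
  -e VPN_SERVICE_PROVIDER=placeholder-provider \
  -e WIREGUARD_PRIVATE_KEY=placeholder-key \
  -e FIREWALL_INPUT_PORTS=51820 \
  -p 51820:51820/udp \
  qmcgaw/gluetun

# 2. WireGuard server for LAN/remote clients. Sharing the gluetun namespace means
#    everything it forwards exits via the commercial VPN, while LAN clients still
#    reach it on the published port.
docker run -d --name vpn-host \
  --cap-add NET_ADMIN \
  --network container:vpn-client \
  -v wg-config:/config \
  linuxserver/wireguard

Per-app split tunneling then stays on the client side, e.g. the WireGuard app for Android can exclude individual applications from the tunnel.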

 

@[email protected] and I have, more or less independently, developed two web apps which, apart from a few extra features, are both meant to make the account migration as easy as possible:

https://stablenarwhal.github.io/Lemmy-Userdata-Migration/

Features:

  • Don't trust me or GitHub? Clone the project and host it yourself or run it locally (Example in Wiki)
  • Export user data from any Lemmy instance (>=v0.19)
  • Download user data as a text file
  • Modify user data, e.g. to add or remove followed users/communities (Example in Wiki)
    • "display_name" ​
    • "bio" ​
    • "avatar" ​
    • "banner" ​
    • "matrix_id" ​
    • "bot_account" ​
    • "settings" ​
    • "followed_communities" ​
    • "saved_posts" ​
    • "saved_comments" ​
    • "blocked_communities" ​
    • "blocked_users" ​
    • "blocked_instances"
  • Transfer user data to the target account on the target instance

https://elvith-de.github.io/lemmy-migration/

Features:

  • Login and export settings from any Lemmy instance (e.g. feddit.de)
  • Optionally: Find local communities on the target instance that match followed communities
  • Optionally: Backup your settings to a file (can be imported on any Lemmy instance in your profile)
  • Login and import settings to any Lemmy instance (e.g. feddit.org)
 

cross-posted from: https://slrpnk.net/post/10823519

So I wrote a little web app that allows a user to move their user data, like settings and subscribed/blocked communities, from one account/instance to another.

It runs completely client-side, but is hosted on GitHub for the moment. Maybe it'll be of some use!

Features:

  • Don't trust me or GitHub? Clone the project and host it yourself or run it locally (Example in Wiki)
  • Export user data from any Lemmy instance (>=v0.19)
  • Download user data as a text file
  • Modify user data, e.g. to add or remove followed users/communities (Example in Wiki)
    • "display_name" ​
    • "bio" ​
    • "avatar" ​
    • "banner" ​
    • "matrix_id" ​
    • "bot_account" ​
    • "settings" ​
    • "followed_communities" ​
    • "saved_posts" ​
    • "saved_comments" ​
    • "blocked_communities" ​
    • "blocked_users" ​
    • "blocked_instances"
  • Transfer user data to the target account on the target instance
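Under the hood this is just Lemmy's settings API. A hedged curl sketch of roughly the round trip the app automates (instance URLs and credentials are placeholders; the import endpoint name is my assumption based on Lemmy 0.19's settings import feature):

#!/bin/bash
# Rough sketch of the export/import round trip against the Lemmy 0.19 API.
# Instance URLs and credentials are placeholders; requires curl and jq.

OLD="https://old-instance.example"
NEW="https://new-instance.example"

# Log in on the old instance and export the user data
OLD_JWT=$(curl -s -H "Content-Type: application/json" \
  -d '{"username_or_email":"old_user","password":"old_password"}' \
  "$OLD/api/v3/user/login" | jq -r '.jwt')

curl -s -H "Authorization: Bearer $OLD_JWT" \
  "$OLD/api/v3/user/export_settings" > userdata.json

# Log in on the new instance and import the data
# (import_settings is assumed to be the counterpart of export_settings)
NEW_JWT=$(curl -s -H "Content-Type: application/json" \
  -d '{"username_or_email":"new_user","password":"new_password"}' \
  "$NEW/api/v3/user/login" | jq -r '.jwt')

curl -s -X POST -H "Authorization: Bearer $NEW_JWT" \
  -H "Content-Type: application/json" \
  -d @userdata.json "$NEW/api/v3/user/import_settings"

The app does all of this client-side in the browser; the sketch just makes the moving parts visible.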
 

Here on the new German main instance, I'm also reposting two simple ways to move your user (settings and subscribed/blocked communities) from one Lemmy instance to another, for example from feddit.de to feddit.org, taken from my original post on feddit.de/c/main (https://alexandrite.app/feddit.de/post/11325409).

Update: For better usability, I've also built a web application that is meant to make the process as easy as possible. You can find it here:

https://stablenarwhal.github.io/Lemmy-Userdata-Migration/

Features:

  • Export user data from any Lemmy instance (>=v0.19)
  • Download user data as a text file
  • Modify user data in the browser, e.g. to add or remove followed instances
  • Transfer user data to the target account on the target instance

Update 2: @[email protected] has also developed a web application with similar functionality. You can find it here:

https://elvith-de.github.io/lemmy-migration/

Features:

  • Login and export settings from any Lemmy instance (e.g. feddit.de)
  • Optionally: Find local communities on the target instance that match followed communities
  • Optionally: Backup your settings to a file (can be imported on any Lemmy instance in your profile)
  • Login and import settings to any Lemmy instance (e.g. feddit.org)

Option 1, if you still have a browser with an active session on feddit.de:

Since version 0.19, Lemmy offers a feature to export and import user data. Normally this works via a button in the settings of the web interface, but that currently doesn't work on feddit.de.

However, the underlying API call still works, as long as you're still logged in to feddit.de in a browser:

  1. Go to https://feddit.de/api/v3/user/export_settings and save the returned file as whatever.json
  2. Take a (new) account on a stable instance of your choice, go to /settings and upload whatever.json via the import button.
  3. Voilà, enjoy the new instance.

This works with any instance >=0.19; you just have to replace the "feddit.de" in the URL. And if the web interface works, you can also use the export button in the settings.


Option 2:

For those without an open browser session, here's a small but functional Bash script that creates a myFedditUserData.json in the directory it is run from, which can then be imported on other instances.

Requirements:

  • Linux / Mac OS X / Windows with WSL
  • jq installed (on Ubuntu/Debian/Mint e.g. via sudo apt install -y jq)

Instructions:

  • Save the script below under any name with a .sh extension, e.g. getMyFedditUserData.sh
  • Open the script in any text editor and fill in your username/email and password (optionally change the instance)
  • Open a terminal in the script's folder and run chmod +x getMyFedditUserData.sh (adjust the name if necessary)
  • Run ./getMyFedditUserData.sh in the terminal
  • A fresh myFedditUserData.json will now be in the folder next to the script

Note: The script is quite simple; it requests a JWT bearer token and passes it as a header in the GET call to https://feddit.de/api/v3/user/export_settings. If you don't have Linux/Mac OS X available, you can reproduce the flow with other tools.

The script:

#!/bin/bash

# Basic login script for Lemmy API

# CHANGE THESE VALUES
my_instance="https://feddit.de"			# e.g. https://feddit.nl
my_username=""			# e.g. freamon
my_password=""			# e.g. hunter2

########################################################

# Lemmy API version
API="api/v3"

########################################################

# Turn off history substitution (avoid errors with ! usage)
set +H

########################################################

# Login
login() {
	end_point="user/login"
	json_data="{\"username_or_email\":\"$my_username\",\"password\":\"$my_password\"}"

	url="$my_instance/$API/$end_point"

	curl -H "Content-Type: application/json" -d "$json_data" "$url"
}

# Get userdata as JSON
getUserData() {
	end_point="user/export_settings"

	url="$my_instance/$API/$end_point"

	curl -H "Authorization: Bearer ${JWT}" "$url"
}

JWT=$(login | jq -r '.jwt')

printf 'JWT Token: %s\n' "$JWT"

getUserData | jq > myFedditUserData.json

@[email protected] has also recreated my script in PowerShell, which works on Windows without WSL: https://gist.github.com/elvith-de/89107061661e001df659d7a7d413092b

# CHANGE THESE VALUES
$my_instance="https://feddit.de" # e.g. https://feddit.nl
$target_file = "C:\Temp\export.json"

########################################################
#Ask user for username and password
$credentials = Get-Credential -Message "Logindata for $my_instance" -Title "Login"

$my_username= $credentials.UserName
$my_password= $credentials.GetNetworkCredential().Password

# Lemmy API version
$API="api/v3"

# Login
function Get-AuthToken() {
    $end_point="user/login"
    $json_data= @{
        "username_or_email" = $my_username;
        "password" = $my_password
    } | ConvertTo-Json

    $url="$my_instance/$API/$end_point"

    (Invoke-RestMethod -Headers @{"Content-Type" = "application/json"} -Body $json_data -Method Post -Uri $url).JWT
}

# Get userdata as JSON
function Get-UserData() {
    $end_point="user/export_settings"

    $url="$my_instance/$API/$end_point"

    Invoke-RestMethod -Headers @{"Authorization"="Bearer $($JWT)"} -Method Get -Uri $url
}

$JWT= Get-AuthToken

Write-Host "Got JWT Token: $JWT"

Write-Host "Exporting data to $target_file"
Get-UserData | ConvertTo-Json | Out-File -FilePath $target_file
 
  1. Install the Userscripts extension for Safari, open the app and go through the setup as instructed by the app. Don't forget to activate the extension for Safari.
  2. After fully setting up the Userscripts extension, go to the TwitchAdSolutions git repo and click on userscript next to video-swap-new. You can play around with vaft as well, but video-swap-new works way better on iOS in my experience. This script replaces ads with a lower resolution stream.
  3. Install the script by opening the Userscripts extension in Safari while the script you opened in step 2 is the active tab and clicking the Install button.
  4. Go to twitch.tv and enjoy your ad-free experience.

Optional:

  • If you have the Twitch app installed, Safari displays the annoying "Open in App" bar at the top of the website. This also leaks into fullscreen mode. To get rid of it, uninstall the Twitch app and optionally install another extension that removes those bars altogether, like Unsmartifier.
  • Want 7TV/BTTV/FFZ features like emotes or a customizable Twitch experience? Install the FFZ userscript, reload twitch.tv and configure your experience and/or install add-ons like 7TV Emotes by clicking the new icon in the top right.
 

Using Reddit without an account is a pain nowadays, especially with any commercial VPN. There are ways around that:

Some of you may know the rather short-lived Libreddit, an awesome frontend for Reddit that got struck down by its own success.

Redlib is a (still working) fork of Libreddit with a handful of instances. Due to Reddit's API limits, it's not very practical to rely on a single instance.

A quite elegant solution is the Automatic Redlib Quota & Error Redirector userscript. Once installed, most Redlib errors are automatically detected and your request gets redirected to another instance. This results in an excellent user experience, although some instances can be a bit slow if you're trying to access media.

The list of available Redlib instances the script uses gets updated quite frequently. The script also works nicely with redirect plugins, e.g. this one for Firefox, if you want to automatically redirect all Reddit URLs to Redlib.
