Blame all the companies that post ridiculously high requirements and then hire people who don't meet all of them anyway. It's common advice to apply even when you don't meet all the reqs, because it works out so often.
This whole thing is basically a nonstory when you realize how much money is in tech. Meta changed their name and sank billions into an idea that everyone thought was stupid from the beginning, and they're still fine.
Putting a billion into the flavor-of-the-month that has like a 10% chance of being the next big thing is a no-brainer when you're printing multiple billions in profit doing nothing and have a lot more cash on hand.
The real story is how wealth inequality and monopolies have essentially allowed the rich to waste tons of money chasing more wealth while having almost no incentive to provide value to society. Who gives a fuck about hallucination and prompt injection? They're all trivial details that VCs are giving away billions to eventually solve.
The point about a binary protocol is interesting, because it would inherently solve the injection issue.
However, constructing an ad-hoc query becomes tedious, since you're now dealing with bytes and text together. Doing that in a terminal is painful enough that most people would need a dedicated tool. Compare this against SQL, where you can easily type out a query in your terminal. I think the tradeoff is similar to protobuf vs JSON.
You could do a text representation (like textproto), but guess what? Now injection is an issue again.
Another thing would be the complexity of client libraries. With SQL, the client library doesn't need to parse or even understand the query - it can send the prepared statement off as-is. With a binary protocol, the client libraries will likely need to include a query builder that assembles the byte representation (a rough sketch of what that might look like is below), since no developer is going to be concatenating bytes by hand, and that raises the bar for open-source libraries. It also means that if you add a new query feature to your DB, every client library will likely need to be updated to support it.
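Just to make that concrete, here's a rough sketch in Python of the kind of builder every client library would end up shipping. The opcode and layout are invented for illustration; they don't correspond to any real database protocol.

    # Hypothetical client-side query builder for a binary wire protocol.
    import struct

    OP_SELECT = 0x01  # invented opcode

    def _field(s: str) -> bytes:
        data = s.encode("utf-8")
        return struct.pack(">I", len(data)) + data  # 4-byte length prefix

    def build_select(table: str, column: str, value: str) -> bytes:
        # Every string is length-prefixed, so a value full of quotes or
        # keywords can't change the structure of the query -- no injection.
        return bytes([OP_SELECT]) + _field(table) + _field(column) + _field(value)

    query = build_select("users", "name", "Robert'); DROP TABLE users;--")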
And you're still going to need to tune and optimize queries for this new DB. That's just the nature of the beast: scaling is hard, especially when you can't throw money at the problem.
Quite frankly, it's a lot of hard tradeoffs just to not need prepared statements or query builders. Injection is still an issue for SQL today, but it's been "solved" about as much as it possibly can be.
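For reference, here's a minimal sketch of what that "solved" looks like in practice - a parameterized query vs. string concatenation, using the stdlib sqlite3 driver and a toy table:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice')")

    user_input = "alice' OR '1'='1"

    # Unsafe: the input rewrites the query and matches every row.
    unsafe = f"SELECT * FROM users WHERE name = '{user_input}'"
    print(conn.execute(unsafe).fetchall())  # [('alice',)]

    # Safe: the driver sends the value separately from the statement, so
    # the input can never change the query's structure.
    safe = "SELECT * FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # []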
I've been using Jooq to build my queries (and run them). Beats the hell out of writing prepared statements in strings.
Not sure what power I'm missing, though - I've been able to do everything I want to do via Jooq.
Everyone loves the idea of scraping, no one likes maintaining scrapers that break once a week because the CSS or HTML changed.
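A toy example of what I mean (BeautifulSoup, with made-up markup and class names): the selector is welded to the site's current HTML, so a harmless frontend redesign breaks the scraper without raising any error.

    from bs4 import BeautifulSoup

    html = '<div class="price-v2"><span class="amount">$19.99</span></div>'
    soup = BeautifulSoup(html, "html.parser")

    # Worked when the class was "price"; now quietly returns None.
    node = soup.select_one("div.price span.amount")
    print(node.get_text() if node else None)  # None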
Quite frankly, as an American, I think it's very American to even consider the timetable as out of your control. In a lot of places, the trains come so frequently that you're not even waiting a few minutes - most drivers take longer to get settled into their seat before driving off. The sorry state of American transit is absolutely not the pinnacle of transit.
Also, trains and light rail have already been automated. The tech is already here.
Is it just me or is the article super misleading? None of the roles are for generative AI for making movies. It looks like the roles are for either research or generic product personalization stuff, none of which is necessarily generative AI. I'm not quite sure why they juxtaposed those AI roles with the ongoing strikes in Hollywood, because they have nothing to do with each other.
Quite frankly, I think the current crop of AI products have yet to take away from the real creative process.
I had a mixed experience adding types to a large enterprise Python codebase.
I think the thing that really kills it is the (relative) lack of community support. Whereas with TS almost every package, big or small, ships with types, I found a lot of pip packages aren't typed out of the box, which means you gotta generate stubs automatically or use escape hatches like Any.
Using escape hatches like Any basically kills the point of typing, since the static checker basically stops checking after it sees an Any. If your static checker is configured to ignore certain files because they aren't typed yet, then any code that refers to those files also gets ignored. You basically need to hit a threshold of typed code across your codebase and dependencies before you get the benefits of typing. Until then, my experience was finding bugs that the type checker should've caught but didn't.
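A small made-up example of the Any problem (the function names are invented; think mypy as the checker): once a value is Any, the checker stops looking at anything downstream of it.

    from typing import Any

    def load_config() -> Any:       # untyped dependency, so we punt to Any
        return {"retries": "3"}     # oops: a string, not an int

    def connect(retries: int) -> None:
        for attempt in range(retries):  # TypeError at runtime: str, not int
            print("attempt", attempt)

    cfg = load_config()
    connect(cfg["retries"])  # mypy is perfectly happy with this line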
And obviously, to get the full power of types, you must buy in as a team, and that means really buying in, without resorting to escape hatches like Any. Any reluctance and you're likely in for an uphill battle.
Another thing that really hurt adoption was that, before adopting typing, a lot of the code just plainly broke type rules, e.g. a function that returns either a string or a number while the caller assumes the output is a number. Especially if it's lower-level code, those may take a nontrivial refactor to fix.
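Something like this made-up example is what I mean - the names are invented, but the shape of the problem is the same:

    from typing import Union

    def parse_port(raw: str) -> Union[int, str]:
        # Legacy behavior: fall back to returning the raw string on bad input.
        return int(raw) if raw.isdigit() else raw

    port = parse_port("eight thousand")
    print(port + 1)  # caller assumes an int; mypy flags this once types exist,
                     # but fixing every such call site is the nontrivial part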
All of this is assuming it's trivial to enforce a static check on the codebase through CI/CD.
This leads to my conclusion: not being forced to use types is a BENEFIT of Python, not a downside. You're able to write code a lot faster and more expressively if you don't need to worry about typing, for small scripts or whatnot. If you're starting a project of any size and already know you want typing, I think you should consider using another language that has typing built in.
That's Silicon Valley's MO. Just half a year ago, people were putting crypto BS in their products.
Good stuff, but really wish it didn't come to this.
As the Mercury News states, finding state-owned abandoned land to build on really isn't a long-term solution. Ideally, it shouldn't take state intervention at all to get new housing started; the initiative should come from the city governments themselves. But it doesn't, and it likely won't for a while.
The problem is the job market has basically priced in exaggerations on resumes. People exaggerate all the time and don't get punished for it.
If you don't exaggerate, you may even miss out on opportunities and hamper your career goals, whatever they may be, because employers already assume you exaggerate and already account for it when reading your resume. And if you don't? Well, they're happy to pay you less than they would've.
Certainly, at least in tech in the Bay Area, fake it till you make it is the norm. I've met plenty of people with amazing resumes and references, only to see them turn out not as good as advertised.