Hey, I've come over from Reddit and thought I'd introduce myself as well. Like every programmer, I've started way too many pet projects, and almost all of them are starving. In terms of frameworks, I prefer Yii2 over Laravel any day. I feel like Laravel gives you a dozen different (seemingly equally good) ways of doing something. You could say it lacks clarity or guidance for the developer.
PHP
Welcome to /c/php! This is a community for PHP developers and enthusiasts to share and discuss anything related to PHP. From the latest updates and tutorials, to your burning questions and amazing personal projects, we welcome all contributions.
Let's foster an environment of respect, learning, and mutual growth. Whether you're an experienced PHP developer, a beginner, or just interested in learning more about PHP, we're glad to have you here!
Let's code, learn, and grow together!
Hey, welcome! Classic haha, I have far too many pet projects as well 😄
And yeah, agreed, it's a bit dizzying to choose a Laravel "path". It would probably be helpful to have a documentation page, sort of like the Remix Stacks, that talks a little bit about which "path" to choose depending on app needs.
Docs are another thing I really don't like about Laravel. Why isn't there a simple API doc with the available functions and their parameters instead of that blog-style documentation? And no, I don't want to watch a video about how to use X, I want to know what functions I can call. Oh, and don't get me started on all their global "helper" functions.
This is not a comment trying to convert you to Laravel, but if you or anyone else is interested, an API doc actually exists and is available here.
I really appreciate that, thank you. I sometimes have to work with an existing Laravel project at work and this is the kind of documentation I was looking for. Short, to the point, without blabbering.
Hey there! I am new here (migrating from Reddit). Full-stack developer (about 7 years). I hope you're all having a great day.
My name is Ryan Parman. The most popular (public) projects I've created are SimplePie (RSS), and I brought Tarzan/CloudFusion with me to Amazon to create the AWS SDK for PHP. I care a lot about performance, code quality, and standardization.
These days I don't write much PHP anymore (mostly Golang, Python, Bash, and Terraform now), but I love to stay up-to-date with the community and improvements in the language.
Seems that between instances, the old comment isn't synced... ugh.
Can anyone here help me understand persistent PDO connections?
So I set up a Patroni cluster, and everything works as expected. But during a failover/switchover, PHP remembers connections to the failed node, and I can't figure out why. Reloading FPM is kind of a PITA.
I will use PgBouncer. But I would still like to know how PHP handles persistent connections.
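For reference, this is roughly how I'm opening the persistent connections (host and credentials here are placeholders):

<?php
// Roughly how the persistent connections are opened. The pgsql driver keeps the
// underlying connection open in the FPM worker after the request ends, keyed by
// the DSN, credentials and options. Host and credentials are placeholders.
$dsn = 'pgsql:host=db.example.internal;port=5432;dbname=app';

$pdo = new PDO($dsn, 'app_user', 'secret', [
    PDO::ATTR_PERSISTENT => true,               // reuse the connection across requests
    PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
]);

$stmt = $pdo->query('SELECT now()');
var_dump($stmt->fetchColumn());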
I haven't used persistent connections, although I have been tempted in the past. I believe that if you haven't used them before, they might come with more trouble than they solve.
As an alternative, I could propose amphp (or maybe ReactPHP), which lets you handle a pool of connections in a single long-running process. But it's a bigger change really, the more I think of it.
Yeah, but any 'single server' PHP defeats the purpose of PHP, so I'm not a huge fan.
I've used them in the past, and they work great, until you have to switchover to another node.
Fair point.
But why do you need persistent connections? I would think the rate at which connections grow should be very low as instances increase, given that the queries are quick.
You lose about 5-6ms just connecting to the database. Keeping them open helps a lot.
My goal is to write code that is as simple as possible, and everything else is super quick; it's just the connections that aren't handled well in PHP. Which is a shame.
It's not that the option isn't there, it's just that I don't know how to help you. MySQL has an option to reconnect; I suppose it might be the same for Postgres?
The single long-running process that was so easily dismissed could save tons of queries, for example! Sorry, I keep thinking in that direction.
Single process doesn't save any queries, no idea what you mean.
Persistent connections persist between requests just like in a single process. It's just that pool handling is hidden in PDO.
Also, how is the setup? You set, for example, 5 max children in FPM and 5 persistent connections? Per server? So your overall connections to the DB server will be 5x your number of server instances?
If you set up 5 FPM children and fewer connections, one child will eventually reuse a connection from another, but only when that connection is free (not running a query for another request and not holding an unconsumed result set in PDO). If it tries to run a query at that moment, it will have to wait and it will block. That is my understanding. Also, how do you do transactions with persistent connections?
This has evolved into such an interesting conversation.
From my current understanding, there is no pool, just one process keeps and reuses one database handle over and over again.
And it's not PDO, but the driver, which handles that.
Transactions are handled within try/finally blocks. You can reset the DB connection, but it's not free in terms of time. You get more performance by making sure the code doesn't leak open transactions.
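Roughly like this (assuming $pdo is the persistent connection; the table and values are just placeholders):

<?php
// Keep transactions from leaking on a persistent connection: if commit() never
// ran, roll back in the finally block so the next request that reuses this
// connection starts clean. $pdo is assumed to be a persistent PDO instance.
try {
    $pdo->beginTransaction();

    $stmt = $pdo->prepare('UPDATE accounts SET balance = balance - :amount WHERE id = :id');
    $stmt->execute(['amount' => 100, 'id' => 42]);

    $pdo->commit();
} finally {
    if ($pdo->inTransaction()) {
        $pdo->rollBack();       // exception or early return: undo the work
    }
}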
Also, to work with persistent connections you would need a pool, right? Because when you query from instance 1, the connection isn't available until you consume the result set. Or is that only MySQL?
Yes, there is a pool, but it's handled by PDO somewhere, and I have no idea how to manipulate it. It just occurred to me to try opening another resource before deallocating the first one.
If that doesn't work, I will give PgBouncer a try. In case that won't do what I need, I'll just use pg_pconnect.
I love PHP so much, this is one of two issues I have with it.
You can check with is_resource maybe?
With a single process, you can cache query results in memory, depending on how often the data changes and how frequently the queries run, for example.
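As a toy sketch of what I mean (the class name and TTL are made up for illustration):

<?php
// Toy in-process query cache for a long-running worker: results live in the
// process memory, so repeated reads of slow-changing data skip the database
// until the TTL expires. All names here are illustrative.
final class QueryCache
{
    /** @var array<string, array{expires: int, rows: array}> */
    private array $cache = [];

    public function __construct(private PDO $pdo, private int $ttl = 30) {}

    public function fetchAll(string $sql, array $params = []): array
    {
        $key = $sql . '|' . serialize($params);

        if (isset($this->cache[$key]) && $this->cache[$key]['expires'] > time()) {
            return $this->cache[$key]['rows'];   // served from memory, no query
        }

        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($params);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

        $this->cache[$key] = ['expires' => time() + $this->ttl, 'rows' => $rows];

        return $rows;
    }
}

Of course this only works because the process survives between requests, which plain FPM doesn't give you.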
The manual at https://www.php.net/manual/en/pdo.connections.php has some interesting notes, and https://www.php.net/manual/en/function.pg-connect.php mentions a force-new kind of setting if you need it. I think it's not a PDO option but a constant you can pass to pg_connect. Stack Overflow also has some discussion from users.
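If I remember right, the constant is PGSQL_CONNECT_FORCE_NEW, passed as the second argument to pg_connect, so it's an ext-pgsql thing rather than PDO. A rough sketch with placeholder connection values:

<?php
// ext-pgsql equivalents of what was discussed above; connection values are placeholders.
$conninfo = 'host=db.example.internal dbname=app user=app_user password=secret';

// Persistent connection, reused across requests by the same FPM worker:
$persistent = pg_pconnect($conninfo);

// Force a brand-new connection even if a matching (possibly stale) one
// is already cached for this process:
$fresh = pg_connect($conninfo, PGSQL_CONNECT_FORCE_NEW);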
Personally, I don't try to optimize that hard in PHP (5 to 10ms due to a DB connection). There is usually an improvement in how the code itself works that would probably give you an order of magnitude more performance. Just saying!
So I finally found the time, and also found my issue: it was PEBKAC all along.
My retry loop was written so haphazardly that it got stuck in an infinite loop after the rebalancing, instead of recovering from it.
After fixing that, it all works as expected. There was no issue with persistent connections after all. Rebalancing halts the benchmark for 3 seconds, then traffic re-routes itself to the correct node.
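For context, the fixed loop is roughly this shape; connect() stands in for my own connection helper and the SQL is a placeholder:

<?php
// Bounded retry around a write, reconnecting on failure instead of looping forever.
// connect() is a stand-in for the app's own connection helper; the SQL is a placeholder.
$sql    = 'INSERT INTO events (payload) VALUES (:payload)';
$params = ['payload' => 'example'];
$pdo    = null;

$maxAttempts = 5;
for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
    try {
        $pdo ??= connect();                 // reuse the handle, or (re)open one
        $stmt = $pdo->prepare($sql);
        $stmt->execute($params);
        break;                              // success, stop retrying
    } catch (PDOException $e) {
        $pdo = null;                        // drop the handle pointing at the failed node
        if ($attempt === $maxAttempts) {
            throw $e;                       // give up instead of spinning forever
        }
        usleep(200_000 * $attempt);         // back off a little before reconnecting
    }
}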
The current setup is a three-node Postgres cluster + PHP, with HAProxy routing the pg connections to the writeable node, and one nginx load balancer/reverse proxy. I tried PgBouncer with and without persistent PHP connections, and it made little to no measurable difference.
The whole thing is a proof of concept for replacing the current production project, which is written in Perl + MySQL (PXC). And I dislike both with such burning passion that I'm willing to use my free time to replace them.
And from my tests it seems that a Patroni cluster can fully replace the multi-master cluster with auto-failover, and we can do a switchover without losing a single request.
Glad to hear it. All of it, actually. Sounds like you are content with it now.
Had to Google PEBKAC. Aren't all problems like that?
That's more of a philosophical question.
My only current issue with PHP is resource handling, and the lack of ability to share sockets between threads/fibers. Erlang actually does this so well; it's a shame that language isn't used more and that managers are afraid of it.
Back to the question: is the resource sharing an issue with PHP, or is it my PEBKAC that I'm trying to bend PHP to something it wasn't designed for?
On one level it certainly is an issue with PHP, but PHP was also designed by a human, and that design comes with its own problems, right? I guess what I said is just a generalisation of PEBKAC, as all (or mostly all) software is designed by some human. The fact that it's a different chair may mean it doesn't count as PEBKAC? Yes, it's philosophical, or simply a matter of which perspective you choose.
Haven't played with amphp/parallel but maybe worth a look to see how/if sockets are shared there.
Yeah, for a PHP socket server, the best route would be ReactPHP or Amp, as building a custom event loop is a big undertaking, and anyone will shoot themselves in the foot or worse.
Since they are single-process loops, you don't need to 'share' them, as they all belong to the only process. But that does limit you to a single process.
The issue is that PDO returns a resource, but for a connection that is no longer writeable.
I did not have time to actually test anything, and I won't until Sunday at least.
Just so you get my situation:
I need to hit a benchmark of 5000 successful write requests per second. So yes, 5ms is way too long for me; the rest of the request is done within 3ms tops.
I beat that benchmark with ease, the only issue is with the failover in Patroni cluster. Once I get some time to sit down, I will report my findings.
There are many other ways to solve this, I just want to better understand what PDO actually does.
Honestly I don't know much about this but I think this can occur because PHP (or more specifically, the PDO extension) isn't necessarily aware of the health of the PostgreSQL node it's connected to. From PHP's perspective, it opened a connection, and unless explicitly told otherwise, it will assume the connection remains open. The Patroni switchover/failover is happening independently of PHP. PgBouncer is probably a good solution for this.
If I knew how to invalidate a failed connection in the persistent pool, that would be enough.
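The closest workaround I can think of is to probe the pooled connection and fall back to a fresh, non-persistent one when the probe fails, something like this (the function and its wiring are just illustrative):

<?php
// Illustrative workaround: probe the cached persistent connection and fall back
// to a fresh, non-persistent one if the node behind it has gone away.
function healthyConnection(string $dsn, string $user, string $pass): PDO
{
    $options = [
        PDO::ATTR_PERSISTENT => true,
        PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
    ];

    try {
        $pdo = new PDO($dsn, $user, $pass, $options);
        $pdo->query('SELECT 1');        // cheap probe; throws if the cached socket is dead
        return $pdo;
    } catch (PDOException $e) {
        // The pooled connection is broken; open a fresh one instead of reusing it.
        $options[PDO::ATTR_PERSISTENT] = false;
        return new PDO($dsn, $user, $pass, $options);
    }
}

It obviously loses the benefit of persistence on the failing path, but at least requests would keep going through.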
Hello, and thanks for creating this community! Sorry if I make grammar mistakes, I'm not a native speaker.
I am a developer who has been working with PHP for a few years now, but I still consider myself a beginner, because I'm the sole developer at my company and I have to learn and do everything by myself and make all the development decisions. I'm currently relearning PHP and Laravel by following more "formal" courses to cover all the bases I'm surely lacking, and then I'm planning to look for a job where I can work as part of a team and learn from other developers.
Speaking of Laravel, I was looking through the fediverse for a Laravel community, since on Reddit it was a separate subreddit from the PHP one. But since this is a small community for now, maybe it's a good idea to keep everything together while the community grows? What do you think?
Solo dev? That's a huge undertaking, especially having to learn by yourself! I think Laravel and Laracasts are an excellent starting point for learning "good" PHP, and it will definitely open doors for your next job.
And yes definitely, I think there could be a Laravel specific community!