So another talk I'm looking forward to seeing this year at Northeast PHP is 'Agile in the Workplace', to be given by Michael Stowe (more details at http://northeastphp.org/talks/view/57/Agile-in-the-Workplace and https://joind.in/talk/view/8914).
To understand why I'm going to find this talk interesting, you have to know that I tend to be one of those people that learns way more through practical examples than I ever do through theory.
The concepts intrigue me, but all the reading in the world isn't going to make it fall into place without real-life exposure. Unfortunately, that would mean finding and working with a group that practices agile correctly... which is not really an option for me. So instead, I'll take the next best thing: listening to someone talk about it, and no doubt hearing stories drawn from practical experience.
I'm looking forward to hearing the good and the bad about the process, and learning how I can avoid the bad practices. For what it's worth, the reason I highlighted the word 'correctly' above is that, from what I understand, 'agile' can very easily be a word that teams throw around without actually being agile. Every story I've heard about such teams tends to imply there is a very painful lesson in the difference between being agile and thinking you're being agile.
Hopefully, too, there may be a way to correctly ease a team into an agile workflow, which I believe would help everyone by being an evolution as opposed to a revolution.
Either way, I say 'viva la agile'. Or something like that... My French is not so good.
Thursday, July 25, 2013
Tuesday, July 23, 2013
Could Second Item Auctions be Used for Ticket Sales?
Every time a big concert, game, show or whatnot goes on sale, we invariably read people complaining about scalpers taking all of the tickets and then jacking up the prices.
Firstly, let me try to summarise my take on both sides of the argument. The anti-scalper argument is that scalpers' business depends on getting hold of as many tickets as possible, so they will always try to be first to buy and have a strong financial incentive to do so. That is to say, the fans have less of a chance to buy the tickets at all.
The pro-scalper argument is that, basically, it's the free market, baby! If scalpers can buy and resell tickets at a profit, that indicates the original ticket price was 'wrong', in that the sellers were leaving money on the table, so to speak.
So, with that as my understanding of the problem, I ask: what would happen if tickets were sold off in the form of a second item auction? A second item auction is one where you are auctioning more than one identical item, and the winners are simply the highest bidders... with the 'twist' being that every winner pays only the lowest winning bid.
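To make that concrete with a made-up example: say there are three tickets on offer and the bids come in at $200, $150, $120 and $100. The three highest bidders each get a ticket, and each of them pays just $120, the lowest winning bid.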
For what it's worth, I always thought this was called a Dutch auction, but Wikipedia actually says that a second item auction can be confused with a Dutch auction, so I don't feel so bad...
Right off the bat, this sounds like a horrible idea for fans: in theory the scalpers are cut out of the process, since the people willing to spend big now do so directly with the seller, but the fans are probably squeezed out too, as they are likely to be outbid.
Over time, though, I wonder whether the prices would fall. There is going to be a group of people bidding higher than they would like just to make sure they get to go to the show, and my hunch is that once scalpers have essentially been squeezed out of the market, those people will find they can slowly lower their bids and still get tickets.
Either way, it would definitely remove a lot of the incentive for the scalpers, and it would even remove the argument for their existence: the whole 'the face-value price is not what the market is willing to bear' line.
And who knows? Maybe the other fans will be happier just knowing that there isn't someone buying tickets out from under them simply to make money. Or not...
Monday, July 22, 2013
NEPHP 2013 Talks: How To Get There
Just another post to keep myself hungry for the Northeast PHP conference this coming August in Boston...
In what I can only assume is the alignment of a number of stars, I see that Larry Ullman is giving a talk titled 'How To Get There' (more details at http://northeastphp.org/talks/view/148/How-To-Get-There and https://joind.in/talk/view/8911).
Lately I have been feeling more and more like I'm not at my peak. I should be better, bigger and more valuable than I am. This is hopefully going to be one of those talks that slaps me in the face and gets me to wake up and push through.
The funny thing about these motivational talks, I find, is that simply acknowledging that I want to see this one already pushes me a little further. Which is awesome.
Still, I really can't wait to see Larry talk. Along with his other talks, 'Ajax: You Can Do It Too' and 'Teaching PHP & Web Development', I foresee a day when Larry may have his own track at NEPHP, and to be honest, I'd probably end up seeing all of it!
Thursday, July 18, 2013
NEPHP 2013 Talks: Package Management in PHP
The Northeast PHP Conference is coming up, and to keep myself motivated and on the edge of my seat, I figured I might write about some of the talks and workshops I'm looking forward to...
One of the talks that I'm really excited to see at NEPHP is titled "Package Management in PHP: Better Late than Never!" (you can see it at http://northeastphp.org/talks/view/21/Package-Management-in-PHP-Better-Late-than-Never and https://joind.in/talk/view/8903).
To be honest, I don't think I have to say much on this at all. As far as I'm concerned, if you don't understand the premise then that's a great sign that you have something interesting to look into. If you do understand it and don't care, then I believe that you're on the wrong path.
The description itself explains the crux of the issue: a lot of current (and some not-so-current) languages already have tools for managing the packages you use in your code, and it's great to see PHP finally making real advances in this area.
Don't get me wrong: PEAR was a pretty good start, but every attempt at improving on that process seemed to fail... until now, with Composer and Packagist.
A simple command line tool that lets you easily dictate the required versions of libraries and then takes care of grabbing them (and their dependencies!), storing them in your project and even managing upgrades? Count me in!
On the one hand we have Composer, which manages these packages; but it has to get them from somewhere, and while it can fetch them from practically anywhere, on the other hand we find Packagist. Packagist is quickly becoming the 'go-to' place for these libraries: it's almost the new PEAR site, or Perl's CPAN. Put the two together and suddenly we have an easy way to find packages, and an easy way to manage them in our own projects.
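To make that workflow a bit more concrete, here is a minimal sketch of how it tends to look in practice (the package name and version constraint are just illustrative examples, not anything from the talk):
composer require "monolog/monolog:~1.0"   # records the constraint in composer.json and pulls the package (plus its dependencies) into vendor/
composer install                          # on a fresh checkout, installs the declared packages
composer update                           # later, upgrades packages within the declared version constraints
From there, a single require of the generated vendor/autoload.php makes the installed libraries available to your code.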
It's becoming more and more obvious that, as time goes by, we will be using more of these small, focussed, single-purpose packages, and keeping track of them all will be a nightmare without something like this. If you're not at least trying to use these packages, then let's face it: you're probably wasting time re-inventing the wheel.
As I mentioned earlier, Composer really is picking up steam, and I'm loving the fact that frameworks such as Symfony 2 (and its little brother Silex) are using it as the go-to system for managing their own packages.
I hope I sound as excited as I actually am about this stuff: it's an awesome development for code reuse that spreads across entirely separate projects, and all of a sudden PHP developers are OK with reaching outside their own source tree to find something that works well and works now.
Hats off to the Composer and Packagist teams, and I can't wait to see Sequoia McDowell's talk at Northeast PHP this year.
Hopefully I'll see you there too!
Wednesday, July 3, 2013
My Take on the Marshmallow Experiment
I just finished reading an article that apparently explains "What Marshmallows Tell Us About Silicon Valley". It's a take on the classic Stanford Marshmallow Experiment.
On and off, I have wondered what I would have done if I were in that situation... and what I would do if I were in that situation now.
I honestly believe it doesn't matter when it took place: I am 99% certain that, if the 'marshmallows' on offer were of basically equal value to me, I would eat the first one then and there.
Apparently that implies that I have little patience and not much self-control. While that is true to an extent (I mean, everyone can point to moments in their lives when that is true), I don't believe that is the reason for me eating the marshmallows straight away.
For me, it's a lot simpler. Firstly, I don't really care that much for 'more candy', and as far as I can remember, I never have; a second marshmallow in 15 minutes' time just seems like a stupid thing to wait for... but then, that may be exactly what the experiment proves. The second reason, however, is what I think really explains why I would not wait.
I was raised to believe that expecting a 'host' to have to do more work for me is unacceptable (where in this case, the host was the experimenter). As far as I'm concerned, a host provides their guests with a venue to facilitate a good time. Although they probably will have food, drink, music or whatever, I've never been to someone's home and then complained after leaving "man, they could have at least offered me a coffee!"
When I was a boy and my friends would come over, my parents made sure they all knew to "make yourself at home". That means feeling comfortable, and that if you want something to eat or drink and it hasn't been offered, you don't have to go without. As my parents would say, "you're a big boy... use your legs!"
I'm off on a tangent now, but I don't want to give the impression that our home was "that place" where everyone just raised themselves. Quite the opposite, actually. "Make yourself at home" meant "you're part of the family", not "treat this place as the place you live in". Conversely, if you didn't want to be part of our extended family, that was fine... just don't expect us to offer you the same courtesies. That meant some interesting interactions between my friends and my parents at times... but at least everyone knew where everyone stood!
Anyway, back to the marshmallows. If I am a guest somewhere and I am offered something, with the option of more later, then to me it would just be rude to expect my host to go out of their way to organise the extra stuff. If it was already prepared and they actually wanted me to have it, they would have offered it to begin with. Anything beyond that and they were obviously just being nice, and I wouldn't want to put them out. If I wanted another marshmallow, surely I should get one myself at a time and place that's more convenient for everyone.
So, long story short: as far as I'm concerned, the reason I would take the marshmallow today has only a little to do with impulse control, and almost everything to do with the fact that I would consider it rude to make the host go and get me something else later on. Instead, stop worrying about me, sit down and have one yourself!
Oh yeah, and you could at least offer me a coffee :)
Monday, June 24, 2013
Northeast PHP Conference 2013
This is me, telling anyone who cares that the Northeast PHP Conference is set to run again this year from August 16-18.
This year is a little different, though, in that as well as the two days of talks, there's also a day of workshops. The speaker line-up is looking awesome, and as a way of testing my typing fingers, I'll probably start writing about which talks I'm most interested in... just to keep myself excited :)
The thing I love about this conference is its focus on inclusion: the talks range from beginner to advanced, and there is an entire track dedicated to User Experience, which I think is sorely lacking in our industry.
If anyone else is going to be in Boston for the conference let me know, as I think it'll be an awesome opportunity to catch up or meet for the first time!
Monday, June 10, 2013
Backing up from Plesk to S3
Recently I went looking for a solution for backing up from a Plesk server to S3: what I settled on was surprisingly simple.
I started with a simple list of criteria, but as I went looking for a solution and kept failing to find a good one, my list got longer. I tend to be quite OK with the bare bones if I'm going to be using an existing system, but if I have to build it myself, I'm normally happy to add more features.
I started with basically "I want to be able to back up files and databases", but the solution I ended up with also gave me the following features:
- back up multiple domains
- along with files and databases, back up the actual domain configuration and mail if required
- rotate backups automatically
- define the frequency of backups and the number of backups to keep before rotating
- open source
- a simple interface right in Plesk
So what did I do?
I don't know, really, whether this is a super smart way to do it or just a cop-out, but basically I realised that, hey, the Plesk backup manager already lets us do all of the above... except for the S3 part. All I ended up doing was installing s3cmd from http://s3tools.org and setting it up to sync to S3 the location on the server where Plesk puts its backups.
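For reference, the one-time setup side of that is roughly as follows (the paths and bucket name simply match the examples below, and the exact prompts will vary with your s3cmd version):
/root/cli-tools/s3/s3cmd -c /root/cli-tools/s3/s3cfg --configure          # interactively stores your AWS credentials in that config file
/root/cli-tools/s3/s3cmd -c /root/cli-tools/s3/s3cfg mb s3://example.com  # creates the bucket that will receive the backups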
So basically, users (or I) define backup rules for each domain as needed (via the Plesk UI) and then s3cmd runs with the sync option once a day.
With s3cmd located in my /root/cli-tools directory, and assuming s3://example.com is the name of the bucket I will use, the actual crontab entry I use is as simple as:
cd /var/lib/psa/dumps; /root/cli-tools/s3/s3cmd -c /root/cli-tools/s3/s3cfg --delete-removed -H --no-progress sync domains s3://example.com/backups
UPDATE: as per a comment by Rutger below, you may actually want to use:
cd /var/lib/psa/dumps; /root/cli-tools/s3/s3cmd -c /root/cli-tools/s3/s3cfg --delete-removed -H --no-progress sync clients s3://example.com/backups
instead of, or possibly in association with, the above line.
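And just to spell out the scheduling part: in the crontab, whichever of those commands you settle on gets a schedule prefix. The time of day here is an arbitrary example; pick something after Plesk has finished producing its own backups:
0 3 * * * cd /var/lib/psa/dumps; /root/cli-tools/s3/s3cmd -c /root/cli-tools/s3/s3cfg --delete-removed -H --no-progress sync domains s3://example.com/backups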
When I commission a new Plesk server, I just copy the s3cmd directory over, create a new bucket and I'm done.
The only downsides I see, really, are that I can't define a single backup rule covering all of the domains, and that I'm assuming all of the backups have finished by the time the cron job runs once a day. Not that the latter matters too much, as I could just bump the cron job up to hourly if I liked and I wouldn't see much difference.
I think the biggest negative of this approach is that I'm pushing backups even when what has been backed up hasn't actually changed. That is to say, if Plesk does a backup every day, then I push a new backup every day... even if nothing has changed since the last one.
Anyway, I hope this helps someone: it was completely obvious to me once I realised it, but it took me an embarrassingly long time to get there.