
Wed, Mar. 4th, 2009, 09:18 am
I’ll just shed tears all over the place

I have to admit that I haven't played around with Git much beyond familiarising myself with the commands enough to get by with the various projects I know that are using it.

That's not to say I don't appreciate aspects of its design - it has a certain inherent elegance that seems to invite experimentation, and it certainly follows the Unix philosophy of small tools designed to be chained together, which also facilitates that.

But to be honest I've just not needed to use it. I use SVN which, for all its flaws (I've never really had a complaint about speed, to be fair, but every time I need to roll back I have to go looking for a blog post I read one time that explains how to do it), does me just fine. When I need to do more distributed work I just fire up the awesome SVK and Bob is, as they say, your Mother's Brother.
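
(For the record, and so I can stop hunting for that blog post: the rollback is just a reverse merge. The revision numbers here are made up.)

    # undo a single committed revision in a working copy, then commit
    svn merge -c -123 .
    svn commit -m "Roll back r123"

    # or undo a whole range, newest back to oldest (undoes r141 to r150)
    svn merge -r 150:140 .
    svn commit -m "Roll back r141 to r150"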

One of the questions that niggled at me though, especially in the wake of the democratising GitHub, was whether it would encourage siloing.

Back in the dark days, when modem speeds were measured in baud and people still thought Digital Watches were a pretty neat idea, there was the NCSA http daemon and Lo! many people maintained various patch sets against it. When you wanted to run it you went and downloaded the main package, then you downloaded all the patches you wanted, and then you tried to apply them, massaging bits here and there where the patch sets hadn't tracked the main app.

It was, to be frank, a giant pain in the huevos.

And then came Apache. Literally "A Patchy Webserver" which rekickstarted (is that even a word?) the stalled development of httpd and collected together the various patches. In my opinion this was a good thing - it's possible some people disagree I suppose - and I fear a return to those early, bad times.

And what's this got to do with Git?

See, Git was designed to help with Linux and Linux doesn't really have a master code base - it has various trees representing different flavours and patches flow between them like, I dunno, bad ideas at a conspiracy theorists' convention. Or something.

That model works well for Linux. Maybe. I'll presume it does. Git eases the pain points expressed by Andrew Morton, the maintainer of the -mm tree, in this message entitled "This Just Isn't Working Any More".

In short, it makes siloing easier.

And that's awesome for them. But it's not what I want for 99.99% of the open source projects I use. No matter how good the tools are, I don't want to be spending time tracking fixes and feature patches round various Git repositories and assembling my own custom version. And as a developer I live in fear of someone saying "I know everyone else is running your code perfectly but when *I* run it under FooBar v1.6 with patches from here, here and here then it fails mysteriously".

I did console myself with the fact that this was a worst case scenario and that it was unlikely to happen for small libraries ... except this morning I was chatting with someone and

17:12  * jerakeen finds that someone has actually fixed the ruby-oauth code to actually
          _work_
17:12 <@jerakeen> assuming you pick the right one of the _27_ forks of the codebase on
                  github.



then later

17:17 <@jerakeen> I forked pelle-oauth a while ago to make my local code actually
                  _work_, because that was important to me
17:17 <@jerakeen> so it was going to get siloed _anyway_
17:17 <@jerakeen> thanks to github, I can tell who else has done what, and where things
                  have gone
17:17 <@muttley> did you push the changes back?
17:18 <@jerakeen> no, not at all. My changes were the equivalent of duct-tape round
                  things. I ripped half of them off from the mailing list, which was 30
                  messages deep in a discussion over what was the Right Way to do it


Now, to be fair to jerakeen - he's one of the smartest and most pragmatic programmers I know, so I'm pretty sure he doesn't enjoy the situation. He was forced into it by the prevailing development methodology of the library he wanted to use, which ended up that way because the version control tool it uses explicitly encourages it.

To use an old and overdone meme



I'm hoping this is a one-off case and that people are just relearning good development manners after forgetting them when presented with a new shiny, sparkly toy. But the cynic and the pessimist in me died a little inside.

Wed, Mar. 4th, 2009 06:33 pm (UTC)
rjw1

ruby's will_paginate gem has gone the same way.

it's bad enough that ruby people have decided that rubyforge is no longer the main repository for gems.

Wed, Mar. 4th, 2009 06:49 pm (UTC)
fanf

I think this is a consequence of an inactive maintainer. In the absence of github it'd be patches on mailing lists and web servers, like the pre-history of Apache.

Wed, Mar. 4th, 2009 07:09 pm (UTC)
deflatermouse

I hope so.

But I don't necessarily agree with your conclusion - in my experience, in the face of an inactive maintainer, people tend either to start really hassling the maintainer until they hand over the reins, or to start a new 'definitive' fork.

The essence, I think, is that when it's easier to keep your code in a silo than it is to push back to trunk (or establish a new trunk) then people will take the patch of least resistance. Which is good for them but bad for the wider community.

...

I think I just argued for enforced code communism.

Wed, Mar. 4th, 2009 07:18 pm (UTC)
fanf

I suppose I agree with you to some extent. However another thing that might happen in the absence of github is a complete lack of published patches. A lot of expedient hacks to get something working are not good enough to be published "properly". Perhaps the zoo on github indicates that people are less embarrassed, or perhaps the reduced impedance means they can make a patch available without "publishing" it.

Thu, Mar. 5th, 2009 01:47 pm (UTC)
2shortplanks

The biggest problem I have with github is it's bloody hard to delete forks. It's the part of the workflow that's missing.

IDEAL GITHUB WORKFLOW:

1. Find bug
2. Find primary repository
3. FORK
4. Patch on fork
5. Push at maintainer
6. Maintainer accepts patches
7. YOU DELETE YOUR FORK

Have you noticed how easy it is to hit the "Fork" button? Why isn't there a just-as-easy-to-get-at "Delete" button for forked projects?

In an ideal world, I'd like to have the option "autodelete this fork when it's integrated upstream".
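
In command-line terms the happy path is roughly this (the user and project names are invented, and the web-only steps are just comments):

    # steps 1-3: find the bug, find the primary repository, hit Fork,
    # then clone your fork (user and project names are hypothetical)
    git clone git@github.com:you/thinger.git
    cd thinger

    # step 4: patch on the fork
    git commit -a -m "Fix the edge case"
    git push origin master

    # step 5: push it at the maintainer - a pull request, or just mail
    # them the URL of your fork

    # steps 6-7: the maintainer merges, and then you go hunting through
    # the web interface for a way to delete the fork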

Mon, Mar. 9th, 2009 07:36 pm (UTC)
ext_173490

+1. I always recommend that everyone who wants to patch Remedie (on github) NOT fork it, but instead use git format-patch and send the patch as a gist. That way I don't need to care about the forked repositories anymore, or about all the wasteful merges there.
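
For anyone who hasn't tried it, that looks something like this (branch names assumed):

    # turn the commits on your branch that upstream doesn't have
    # into mailable patch files
    git format-patch origin/master

    # that writes files like 0001-Fix-the-edge-case.patch, which you
    # can paste into a gist or attach to a mail instead of publishing
    # a whole fork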

Thu, Mar. 5th, 2009 09:29 pm (UTC)
ext_172486

The distributed nature of (git|hg|bzr) makes the chaos possible and even a bit more likely but, at least with git, it makes the chaos easier to calm.

Speculate for a moment that you've got a fancy new checkout of Thinger from Thinger's Official Subversion and you find an edge case that needs immediate fixing for your project to continue. You fix it. To share the love, you're stuck with a single patch blasted to a mailing list because you had to check it out over HTTP and you can't commit back.

Speculate now that Thinger is using Git (not necessarily GitHub, but they make it even easier). You've got a copy of all of your changes, neatly annotated with commit messages, that the Thinger maintainers can pull into a branch on their copy, examine and merge. Git takes the guesswork out of the patch-and-pray-it-works workflow.
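
On the maintainer's side that might look something like this (the repository URL and branch names are invented):

    # add the contributor's published repository as a remote and fetch it
    git remote add alice git://example.com/alice/thinger.git
    git fetch alice

    # review the changes on a topic branch, then merge if they're good
    git checkout -b review-alice master
    git merge alice/master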

Thu, Mar. 5th, 2009 10:47 pm (UTC)
deflatermouse

Oh, I get all the advantages of Git, and the workflow you're describing is largely how I use SVK.

My problem is that, whilst Git makes that workflow possible, if people stop doing the final step of merging the branch back into trunk then we end up with little siloed versions of apps.

So, for example, I branch Thinger and fix a few bugs, and you branch it and add WeebleFlitzing support, and neither of us pushes our changes back, and since the maintainer doesn't know about us he doesn't pull them back in.

Now J Random Person comes by and wants both my bug fixes and your WeebleFlitzing support and then he has to go to the effort of doing the merge just like in the old days of NCSA httpd.
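
Concretely, he ends up doing something like this himself (repository URLs invented) and hoping our changes don't conflict:

    # start from the original, unmaintained Thinger
    git clone git://example.com/upstream/thinger.git
    cd thinger

    # pull in my bug fixes and your WeebleFlitzing branch
    git remote add bugfixes git://example.com/me/thinger.git
    git remote add weebleflitz git://example.com/you/thinger.git
    git fetch bugfixes
    git fetch weebleflitz

    # merge both and sort out whatever conflicts fall out
    git merge bugfixes/master
    git merge weebleflitz/master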

I admit that's a worst-case scenario and I really hoped that I was just being curmudgeonly, but that sort of behaviour has already started to happen, which is sooner than even I suspected.

The odd thing is that mostly I've seen it in the Ruby community. I don't know quite what to make of that.

Fri, Mar. 6th, 2009 12:18 pm (UTC)
edge_walker

The odd thing is that mostly I’ve seen it in the Ruby community.

“I just put it in vendor.”

Sat, Mar. 7th, 2009 11:45 pm (UTC)
rcrowley

(I hope the friendly people at GitHub see this thread.)

My point above was that Git makes it easy. You've got me though, Git doesn't make it *happen*. GitHub may be able to help here. It'd be fun to see a "community" version of every repo that's been forked. This version would automatically pull from all of the forks. That puts a lot of faith in the patches made by the forkers but the middle ground of notifying the originator when any commits are made to forks is a bit tedious and brings us back to NCSA-syndrome.

Stepping back for a moment, I don't think it's possible to make any of this effortless without completely trusting forkers-at-large. So Git simply making it easier to merge might be enough of a win to call it a day.

Mon, Mar. 16th, 2009 04:21 pm (UTC)
static_panic

it's okay to be curmudgeonly.

i had received patches from someone who forked my svn project into github. i appreciate getting patches (i love it really), but his fork was behind my mainline and required a lot of hand-holding to integrate his patches.

really frustrating, even for my really small project.

either way, AFAIK git makes it easy to send patches back upstream to mainline; maybe it's github that needs to work on facilitating this?

one of my projects at work uses it and the distributed nature has been a god-send.

Fri, May. 15th, 2009 10:27 am (UTC)
khnimabaledb

of course, people will only learn something if something happens.

mike