open source terminal and scanning
Fairlight
fairlite at fairlite.com
Tue Oct 26 14:25:04 PDT 2010
When asked his whereabouts on Tue, Oct 26, 2010 at 01:53:30PM -0400,
Brian K. White took the fifth, drank it, and then slurred:
>
> My thinking with git was A) since cvs and svn existed already a long
> time, there must have been some reason "they" went through the bother of
> inventing git. and B) since the linux kernel uses git, those reasons
> probably have some merit.
>
> I'm not married to it exactly, but I have no reason to believe this
> reasoning is broken yet.
Read the Wikipedia entry on Git. Linus himself created Git. The reasons
listed are kind of a special case, given the size and scope of both the
Linux kernel that Git was written to manage, and the sizeable base of
users working on the project. It reads like he basically wrote his own
for performance reasons.
But Subversion isn't going anywhere, I don't think. In fact, it was
recently accepted into the Apache Software Foundation as an official project,
so the main site is now subversion.apache.org. And you know how Apache
is...heck, they're still supporting 1.3.xx, to their credit.
The size of project you're talking about, combined with the likely number
of contributors...comparing it to other projects that use SVN, I don't
think the performance boosts supposedly inherent in Git are something you'd
necessarily notice, and almost certainly wouldn't need. PuTTY just isn't
that big. Now the Linux kernel...yeah, I can see that being a beast and
requiring something a little special. But that's totally an oddball case.
I'd think something like X.org might fall into that territory as well,
since it's SO damned big.
> I'll give myself a little more time to either get git going or take you
> up on your offer if it's still open at that time.
>
> One thing I know is I don't think I want to switch systems later even if
> it's relatively easy to export/import the data, simply because I don't
> want to tell users (all one or two of them haha) to change how they do
> things after they've gone through the bother to get set up, which is why
> I'm not just saying ok thanks immediately. I haven't examined the pros &
> cons of the different systems enough to pick one based on anything more
> than the speculation above.
Fair enough reasoning and argument.
> Maybe I'll just use a 3rd party like sourceforge or codeplex.
*shudder* I so dislike SF anymore. It's like CPAN, but -harder- to find
what you want half the time. And there are actually a few orders of
magnitude more abandoned projects--a large percentage of which never left
pre-planning phase.
That, and I really dislike the interface--and that's just for
finding/downloading. I haven't even -dealt- with the updating.
> Oh sure I get that. How will I ever keep up with the constant flood of
> registration requests and patch submissions?
*cackle* You'll find a way...somehow. :)
> Now if a mere 100 people worldwide could put up 7.5k each (I would hands
> down right out of my own personal pocket let alone get Aljex to as
> well), or a mere 50 people put up 15k (I might do that too if it was a
If you have 7.5k to blow, obviously the recession hasn't hit you horribly
hard. I'm glad for you. Seriously.
> sure and final thing), or any combination that adds up to 750k, and if
> the talk that was loosed at that meeting including from Bud himself was
> more than hot air, we could buy fp and release it to all once and for all
> and start clearing up the plethora of easily addressed things that cause
> idiotic amounts of developer effort to work around.
Would be nice.
> Then again, again, one of the only reasons fp no longer competes with
> stuff like that is because the users who would gladly contribute, can't.
> Every day that goes by with things as they are, the less I think even I
> would bother even if I suddenly could do whatever I wanted. We've been
> working the last few years on all new completely web front ends to our
> stuff, and that has required us to make our stuff work more
> transactionally just as a natural consequence. Before that, it would
> have been laughably impractical to "rewrite everything in some other
> language and using some other db". But, once you are performing packaged
> discrete transactions instead of a rats nest tangle of multiple
> concurrent lookups and @triggers, it becomes almost easy to start
> swapping out pieces of the application from talking to filepro to
> talking to some other db, one piece at a time.
Exactly. Because you're not even really working with fP the way it was
intended, as one persistent execution of a binary, using the UI provided.
You've shoehorned it in (as I have many times) to a situation where it can
do what's needed, but it's not the most efficient tool for the job. We get
transactional states, but they're micro-runs of a monolithic binary, most
of which isn't even being used. This uses up far more system resources
than required, and comes with the burden of a user limit in a context where
"user" has an entirely different definition, because the environment isn't
even remotely parallel. Now, if they'd ever released that API that Ron was
talking about AROUND FIVE YEARS AGO that was supposedly more or less done
but which has never materialised, that'd be another story entirely. Given
the development/release times of FPGI, I estimate that the API should be
good for release in another 4-6 years--if it doesn't go the way of "Gold"
or "Jumpstart".
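The "swap out pieces one at a time" approach Brian describes above amounts to
programming against a storage interface: once each lookup is a discrete
transaction, the backend behind it can be replaced independently. A minimal
sketch of that seam, in Python; all class and method names here (OrderStore,
FileProStore, SqlStore) are hypothetical illustrations, not filePro APIs:

```python
from abc import ABC, abstractmethod


class OrderStore(ABC):
    """One discrete transaction boundary for order lookups."""

    @abstractmethod
    def get_order(self, order_id: str) -> dict:
        ...


class FileProStore(OrderStore):
    """Stand-in for the legacy filePro-backed path."""

    def __init__(self, records: dict):
        self._records = records

    def get_order(self, order_id: str) -> dict:
        return self._records[order_id]


class SqlStore(OrderStore):
    """Stand-in for a replacement backend honoring the same contract."""

    def __init__(self, rows: dict):
        self._rows = rows

    def get_order(self, order_id: str) -> dict:
        return self._rows[order_id]


def handle_lookup(store: OrderStore, order_id: str) -> dict:
    # Application code depends only on the interface, so each backend
    # can be swapped independently, one transaction type at a time.
    return store.get_order(order_id)
```

The point of the seam is that the web front end calling handle_lookup never
knows (or cares) which backend answered, which is exactly what makes the
piecewise migration practical.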
mark->
--
Audio panton, cogito singularis.
More information about the Filepro-list mailing list