This is extremely interesting. I think the people who'd benefit most from this are the self-hosting crowd and freelance developers. Mostly freelance devs, I believe.
For my stuff, I'd definitely use this - but my first order of business would be to talk people into getting proper hosting so I wouldn't have to. Obviously that's not possible for freelance gigs. I'll definitely use it when I need to deploy to X host. Normally though I do quite a decent job of remembering which files I need to upload, but now I don't have to!!
The only useful infrastructure my university will give me is a login account on an ancient (locked-down) Solaris box, and an FTP account. I could make ftp:// links shareable with colleagues and point them to TortoiseGit as a useful client for accessing and editing said stuff using their existing login credentials (via judicious use of chmod), and this would solve a significant problem.
Git (and Mercurial and friends) have event hooks that can be used for a variety of automation tasks. For instance, I have a repository that consists of Sphinx documentation for a project. When my webserver's copy of the repo receives new changesets, it updates to tip, builds the HTML and PDF targets, and if successful deploys them into the appropriate web-accessible locations.
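Such a hook is easy to sketch. The block below is a runnable toy version of the idea, not the poster's actual setup: the Sphinx HTML/PDF build is replaced by a `cp` (a real hook would run e.g. `make html latexpdf`), and throwaway temp dirs stand in for the server paths.

```shell
#!/bin/sh
set -e

BASE=$(mktemp -d)
git init -q --bare "$BASE/site.git"

# The hook: on every push, check out the new tip into a scratch work tree,
# "build", and deploy the result only if that succeeded (set -e aborts).
cat > "$BASE/site.git/hooks/post-receive" <<'EOF'
#!/bin/sh
set -e
WORK_TREE="$PWD/../checkout"   # hooks run with cwd = the bare repo
DEPLOY="$PWD/../www"
mkdir -p "$WORK_TREE" "$DEPLOY"
git --work-tree="$WORK_TREE" --git-dir="$PWD" checkout -qf master
cp "$WORK_TREE/index.txt" "$DEPLOY/index.html"   # stand-in for the real build
EOF
chmod +x "$BASE/site.git/hooks/post-receive"

# A client pushes a change...
git init -q "$BASE/client"
cd "$BASE/client"
echo hello > index.txt
git add index.txt
git -c user.email=you@example.com -c user.name=you commit -q -m "add index"
git push -q "$BASE/site.git" HEAD:master

# ...and the hook has already deployed it to the "web root":
cat "$BASE/www/index.html"
```

The nice property is the one the parent describes: deployment only happens when the build step exits successfully, so a broken push never reaches the web-accessible directories.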
Nifty hack, but doesn't ncftp only upload changed files by default? ISTR running into this exact problem a few years back trying to deploy CakePHP apps to cheapo hosts.
No, I moved to Rails; Cake was a short-lived endeavour. Cake was actually what got me there: I liked the things it did but hated the way it was written. Then I found out they were basically trying to clone Rails in PHP, so I gave Rails a shot.
I used this a while back when I still did PHP work, deploying to shared hosts. I personally enjoyed using it and don't recall having had any issues with it. It's not much use if you have shell access of some sort on your server, though.
My organization's public-facing website is on a .mil domain.
Our sysadmins tell us that for compliance reasons any server-side processing is prohibited, and right now the only way to update the site is a combination of RDP and sneakernet. Until recently, though, we could update it via FTP. (They got in trouble for it during our last inspection, hence the sneakernet.)
I've been playing with the idea of building a new site for our public affairs folks (the ones who are responsible for content) based on Pelican or Jekyll. Using a static site generator will get us around the no-server-side problem, but there is still the problem of needing to transfer an entire site's worth of data every time a small edit is made. If we still had FTP access to our server, this would solve that problem.
What about: rsync to the external flash drive (or whatever your sneakernet is based on), then rsync from the flash drive to your public-facing website.
The server hosting the public-facing site is across town (where the sysadmin works). The RDP step is for getting from our network to a non-public-facing machine that is physically right next to the public-facing machine.
Then, at two or three pre-appointed times during the week, the sysadmin transfers the contents of a folder onto a USB hard drive, and eventually moves it to where it needs to go.
I suppose it's time to start reading the "Who is hiring" threads a little more closely.
Stuck on a corporate network managed by dipshit admins who won't open up anything but FTP (and even that will require executive approval).
Presumably, rather than fighting them, you could just use this. The public sector has taught me to look for technical solutions that negate human problems.
Put a cost on not getting what you want. An utterly extortionate one. That will sway the balance. No managerial position will want to be seen driving costs up. Works for me every time.
Fortunately I work at a company now where the software team rules with an iron fist and managers whimper behind their desks at the mighty shitstorm that happens when they put pointless walls up.
But there never was a problem. Those FTP clients have been around since the mid-1990s. He created a "problem" in order to play with a favorite piece of software: git. This is typical behavior. git is clearly among the softwares, systems (Linux?) and devices (iPhone?) that have "fanboys".
So we can be sure we'll be hearing more about git. More fabricated "problems" to solve. Fanboys are blinded to all of history and all else besides their chosen infatuation.
Meanwhile, no matter how wonderful git is, I still have to install multiple versioning systems (cvs, svn, hg and git, at the bare minimum), because programmers can't agree on just one. Now _that_ is a problem.
> git is clearly among the softwares, systems (Linux?) and devices (iPhone?) that have "fanboys".
They all do. There are MVS fanboys, VM fanboys, ITS fanboys (oh, yes), Windows fanboys, and so on, and so forth. It isn't the technology, it's the people.
True. There is nothing wrong with git. It is great software.
But like a good song, if the radio stations overplay it, they can ruin the enjoyment of it for some listeners.
I think maybe we (nerds) all have the urge to be fanboys. We all have some software that we really like. Yet there are many examples of people who resist the urge to be a fanboy. Alas, the ones who give in to it are the ones who post their follies on the web and announce it to the world. The ones who don't are silent.
HostGator and A Small Orange are both cheap shared hosts I can recommend that support SSH. The only shared host I have run into that didn't support SSH was $25/month for one MySQL database and some paltry amount of storage space.
I'm sure there are a lot of cheap shared hosts that don't provide SSH, but there are hundreds that do, and the big players (who are the most reliable anyway) definitely do.
He encountered a problem: "Most of the low-cost web hosting companies do not provide SSH or git support, but only FTP." He then solved that problem and decided to make the work available to others.
Did you forget who developed git? So in this case it's fine ;)
But honestly, I also have a hard time finding a use case for this. The question I ask myself over and over again: why not mount an FTP share and point your origin to it? In fact, I did something similar when I was still in university. My desktop computer had my central git repo; I either pushed to it locally via the FS or remotely via SSH.
Anyway, it's a good thing to have git over FTP, but I predict it will never get traction because it seems to be in part a reimplementation of git?! (At least that's what the GitHub page suggests, because every basic git cmd is explained, but with a git-ftp prefix... ;))
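For what it's worth, the mount-and-push idea sketches out like this. curlftpfs is one way to get the mount (an assumption; any FUSE/network mount would do), and below a plain temp dir stands in for the mount point so the git side can be tried locally: to git, a mounted share is just a filesystem path.

```shell
#!/bin/sh
# On a real setup, first: curlftpfs ftp://user@host /mnt/ftp
set -e

MOUNT=$(mktemp -d)                  # stand-in for the mounted FTP share
git init -q --bare "$MOUNT/repo.git"

WORK=$(mktemp -d)
cd "$WORK"
git init -q
echo readme > README
git add README
git -c user.email=you@example.com -c user.name=you commit -q -m "first"

# origin is nothing more than a path on the mount
git remote add origin "$MOUNT/repo.git"
git push -q origin HEAD:master
```

The catch, and presumably part of why a dedicated tool exists, is that over a real FTP mount every object write becomes an FTP round trip, and servers with flaky rename support can leave a push half-finished; a local disk or an SSH remote doesn't have that problem.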