cloning context git repository
On Sat, Jun 21, 2014 at 10:23 PM, Pablo Rodriguez wrote:
Dear Mojca,
as Philipp already reported some time ago (http://www.ntg.nl/pipermail/ntg-context/2014/077848.html), I’m afraid I cannot clone the git repo either:
$ git clone http://git.contextgarden.net/context/context.git
Cloning into 'context'...
remote: Counting objects: 56495, done.
remote: Compressing objects: 100% (11028/11028), done.
error: RPC failed; result=18, HTTP code = 200MiB | 311.00 KiB/s
fatal: The remote end hung up unexpectedly
fatal: early EOF
fatal: index-pack failed
I can clone the Linux repo too:
$ git clone https://github.com/torvalds/linux.git
Cloning into 'linux'...
remote: Reusing existing pack: 3654061, done.
remote: Total 3654061 (delta 0), reused 0 (delta 0)
Receiving objects: 100% (3654061/3654061), 1.00 GiB | 293.00 KiB/s, done.
Resolving deltas: 100% (3016190/3016190), done.
Checking connectivity... done.
Checking out files: 100% (47427/47427), done.
I’m using git-1.9.3 for Linux. I’m on Fedora 20, but I had the same results with Ubuntu 14.04 (which comes with git-1.9.1):
$ git clone http://git.contextgarden.net/context/context.git
Cloning into 'context'...
remote: Counting objects: 56495, done.
remote: Compressing objects: 100% (11028/11028), done.
error: RPC failed; result=18, HTTP code = 200MiB | 294.00 KiB/s
fatal: The remote end hung up unexpectedly
fatal: early EOF
fatal: index-pack failed
There is a workaround for this at https://stackoverflow.com/a/22442535:
git clone --depth 1 http://git.contextgarden.net/context/context.git
git fetch --unshallow
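The workaround can be sanity-checked locally: the same shallow-then-unshallow sequence works against any remote. Below is a sketch using a throwaway repository served over file:// (all paths and names here are illustrative, not from the thread):

```shell
#!/bin/sh
# Local demonstration of the shallow-clone workaround: clone only the
# newest commit first (a small, fast request), then deepen to full history.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Build a small stand-in "remote" with three commits.
git init -q origin-repo
cd origin-repo
git config user.email pablo@example.com   # illustrative identity
git config user.name  pablo
for i in 1 2 3; do
    echo "$i" > file.txt
    git add file.txt
    git commit -qm "commit $i"
done
cd ..

# Step 1: shallow clone -- only the latest commit is transferred,
# so the initial request finishes well before any server timeout.
git clone -q --depth 1 "file://$tmp/origin-repo" work
cd work
echo "after shallow clone: $(git rev-list --count HEAD) commit(s)"

# Step 2: fetch the remaining history in a follow-up request.
git fetch -q --unshallow
echo "after unshallow: $(git rev-list --count HEAD) commit(s)"
```

The point of splitting the transfer is that each individual HTTP request stays small enough to finish inside the server's per-request timeout.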
You replied to Philipp that it worked for you. I assume your OS is Mac OS X.
Has anyone else succeeded or failed in cloning the context git repo? Which OS are you using?
Many thanks for your help,
Pablo
--
http://www.ousia.tk
___________________________________________________________________________________ If your question is of interest to others as well, please add an entry to the Wiki!
maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://tex.aanhet.net
archive  : http://foundry.supelec.fr/projects/contextrev/
wiki     : http://contextgarden.net
___________________________________________________________________________________
$ git clone http://git.contextgarden.net/context/context.git
Cloning into 'context'...
remote: Counting objects: 56495, done.
remote: Compressing objects: 100% (11028/11028), done.
remote: Total 56495 (delta 44019), reused 54783 (delta 42937)
Receiving objects: 100% (56495/56495), 93.42 MiB | 3.24 MiB/s, done.
Resolving deltas: 100% (44019/44019), done.

Ubuntu 12.02

--
luigi
On Sat, Jun 21, 2014 at 10:23 PM, Pablo Rodriguez wrote:
Dear Mojca,
as Philipp already reported some time ago (http://www.ntg.nl/pipermail/ntg-context/2014/077848.html), I’m afraid I cannot clone the git repo either:
$ git clone http://git.contextgarden.net/context/context.git
Cloning into 'context'...
remote: Counting objects: 56495, done.
remote: Compressing objects: 100% (11028/11028), done.
error: RPC failed; result=18, HTTP code = 200MiB | 311.00 KiB/s
fatal: The remote end hung up unexpectedly
fatal: early EOF
fatal: index-pack failed
I can clone the Linux repo too:
$ git clone https://github.com/torvalds/linux.git
Cloning into 'linux'...
remote: Reusing existing pack: 3654061, done.
remote: Total 3654061 (delta 0), reused 0 (delta 0)
Receiving objects: 100% (3654061/3654061), 1.00 GiB | 293.00 KiB/s, done.
Resolving deltas: 100% (3016190/3016190), done.
Checking connectivity... done.
Checking out files: 100% (47427/47427), done.
I’m using git-1.9.3 for Linux. I’m on Fedora 20, but I had the same results with Ubuntu 14.04 (which comes with git-1.9.1):
$ git clone http://git.contextgarden.net/context/context.git
Cloning into 'context'...
remote: Counting objects: 56495, done.
remote: Compressing objects: 100% (11028/11028), done.
error: RPC failed; result=18, HTTP code = 200MiB | 294.00 KiB/s
fatal: The remote end hung up unexpectedly
fatal: early EOF
fatal: index-pack failed
There is a workaround for this at https://stackoverflow.com/a/22442535:
git clone --depth 1 http://git.contextgarden.net/context/context.git
git fetch --unshallow
You replied to Philipp that it worked for you. I assume your OS is Mac OS X.
Has anyone else succeeded or failed in cloning the context git repo? Which OS are you using?
Many thanks for your help,
Hi,

from what I understand the issue is that GitLab times out after a while. We managed to reproduce the problem though, and the admin recently increased the timeout from 30 to 300 seconds, but we cannot just increase the timeout to infinity. The reason why I wasn't able to reproduce the problem earlier was that I always had a fast connection (or I was downloading/uploading just a small diff).

Maybe a more relevant stackoverflow question would be this one: http://stackoverflow.com/questions/18138047/how-to-configure-nginx-unicorn-t... plus filing a bug report to GitLab.

As a workaround you can also fetch the files from my personal GitHub mirror.

One day we should probably reinstate the repositories at gitorious and repo.cz (one minor issue is that the repos are not compatible – I fetched all commits from Marius' repo and rewrote them). And maybe create a project at GitHub, but "context" is already taken.

Mojca
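For anyone hitting the same limits on their own GitLab instance: the nginx/unicorn question referenced above concerns the two places such a timeout lives. A rough sketch of the relevant settings (the directive names are standard nginx and unicorn options; the 300-second value mirrors the current contextgarden setting, while the file locations are typical rather than confirmed):

```
# nginx server block proxying to GitLab:
# keep slow git-over-HTTP transfers alive longer before nginx gives up.
proxy_connect_timeout 300;
proxy_read_timeout    300;
proxy_send_timeout    300;

# config/unicorn.rb: a worker handling one request for longer than this
# is killed, which the git client then sees as "early EOF".
timeout 300
```

Both layers have to agree: raising only one of them still leaves slow clones dying at the other.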
Here's the explanation (probably not very useful for users): http://stackoverflow.com/questions/21697107/timeout-on-https-requests-to-git...

To me that looks like some kind of a flaw in design. I also figured out that the server constantly keeps complaining about excessive memory consumption.

I added some comments to the issue tracker of an already opened ticket: https://github.com/gitlabhq/gitlabhq/issues/3882

Mojca
On 06/21/2014 11:32 PM, Mojca Miklavec wrote:
Hi,
from what I understand the issue is that GitLab times out after a while. We managed to reproduce the problem though and the admin recently increased the timeout from 30 to 300 seconds, but we cannot just increase the timeout to infinity.
Hi Mojca,

many thanks for your reply.

How about increasing the timeout to 600 seconds?
As a workaround you can also fetch the files from my personal GitHub mirror.
This is what I have done.
And maybe create a project at GitHub, but "context" is already taken.
Also contextgarden is already taken at GitHub.

context is taken at Bitbucket, but not contextgarden. How about Bitbucket?

Many thanks for your help,

Pablo
--
http://www.ousia.tk
On Sun, Jun 22, 2014 at 8:03 AM, Pablo Rodriguez wrote:
On 06/21/2014 11:32 PM, Mojca Miklavec wrote:
Hi,
from what I understand the issue is that GitLab times out after a while. We managed to reproduce the problem though and the admin recently increased the timeout from 30 to 300 seconds, but we cannot just increase the timeout to infinity.
Hi Mojca,
many thanks for your reply.
How about increasing the timeout to 600 seconds?
We can, but then at least we need to make a rough calculation of what transfer speed should still be supported. And no matter what value we use, it will break for people (I also get a lot of "memory limit exceeded" warnings in the logs, without even having any significant number of users). The repository is currently a bit below 100 MiB.

And if I put a complete distribution into the repository as intended, we could probably set the time limit to a few days ;)
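The rough calculation is just size divided by timeout: to move the whole pack before the server cuts the connection, the client needs a sustained speed of at least that quotient. A quick sketch, using the ~100 MiB repository size mentioned above (102400 KiB):

```shell
#!/bin/sh
# Minimum sustained speed needed to transfer 100 MiB (102400 KiB)
# within a given server-side timeout.
for timeout in 300 600; do
    awk -v t="$timeout" \
        'BEGIN { printf "%d s timeout -> at least %.1f KiB/s needed\n", t, 102400 / t }'
done
```

At roughly 300 KiB/s, the speed reported in the failing clones earlier in this thread, a 300-second timeout sits right at the limit (about 341 KiB/s required), while 600 seconds would lower the requirement to about 171 KiB/s.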
As a workaround you can also fetch the files from my personal GitHub mirror.
This is what I have done.
And maybe create a project at GitHub, but "context" is already taken.
Also contextgarden is already taken at GitHub.
Indeed. http://github.com/contextgarden/context
context is taken at Bitbucket, but not contextgarden. How about Bitbucket?
I need to make an account first.

Mojca
On 06/22/2014 10:40 AM, Mojca Miklavec wrote:
On Sun, Jun 22, 2014 at 8:03 AM, Pablo Rodriguez wrote:
[...] How about increasing the timeout to 600 seconds?
We can, but then at least we need to make a rough calculation of what transfer speed should still be supported. And no matter what value we use, it will break for people (I also get a lot of "memory limit exceeded" warnings in the logs, without even having any significant number of users). The repository is currently a bit below 100 MiB.
And if I put a complete distribution into the repository as intended, we could probably set the time limit to a few days ;)
GitHub is then fine.
And maybe create a project at GitHub, but "context" is already taken.
Also contextgarden is already taken at GitHub.
Indeed.
Sorry, I overlooked that this was ConTeXt.
context is taken at Bitbucket, but not contextgarden. How about Bitbucket?
I need to make an account first.
Well, my previous question doesn't make sense if GitHub has a ConTeXt repo.

Many thanks for your help,

Pablo
--
http://www.ousia.tk
On Sun, Jun 22, 2014 at 12:00 PM, Pablo Rodriguez wrote:
On 06/22/2014 10:40 AM, Mojca Miklavec wrote:
On Sun, Jun 22, 2014 at 8:03 AM, Pablo Rodriguez wrote:
[...] How about increasing the timeout to 600 seconds?
We can, but then at least we need to make a rough calculation of what transfer speed should still be supported. And no matter what value we use, it will break for people (I also get a lot of "memory limit exceeded" warnings in the logs, without even having any significant number of users). The repository is currently a bit below 100 MiB.
And if I put a complete distribution into the repository as intended, we could probably set the time limit to a few days ;)
GitHub is then fine.
No, it's not. They'll kick us out once we start putting gigabytes of binaries on their site.
And maybe create a project at GitHub, but "context" is already taken.
Also contextgarden is already taken at GitHub.
Indeed.
Sorry, I overlooked that this was ConTeXt.
You couldn't have known it. I put the repository there after you asked, and there was no description there earlier.
context is taken at Bitbucket, but not contextgarden. How about Bitbucket?
I need to make an account first.
Well, my previous question doesn’t make sense, if GitHub has a ConTeXt repo.
It still does if some users are BitBucket addicts ;)

In any case it would make sense to update the repository on Gitorious at some point.

Mojca
On 06/22/2014 02:04 PM, Mojca Miklavec wrote:
On Sun, Jun 22, 2014 at 12:00 PM, Pablo Rodriguez wrote:
On 06/22/2014 10:40 AM, Mojca Miklavec wrote:
[...] And if I put a complete distribution into the repository as intended, we could probably set the time limit to a few days ;)
GitHub is then fine.
No, it's not. They'll kick us out once we start putting gigabytes of binaries on their site.
I understand their position. Huge binaries on GitHub would make them something different than a git hosting provider.

BTW, what is the problem with distributing the huge binaries on CTAN?
Sorry, I overlooked that this was ConTeXt.
You couldn't have known it. I put the repository there after you asked, and there was no description there earlier.
I thought I hadn’t read a description, but I tend not to trust my memory too much ;-).

If you allow me a suggestion for GitHub, I’d remove the issue tracker on both repositories.

Pablo
--
http://www.ousia.tk
On Sun, Jun 22, 2014 at 5:09 PM, Pablo Rodriguez wrote:
On 06/22/2014 02:04 PM, Mojca Miklavec wrote:
On Sun, Jun 22, 2014 at 12:00 PM, Pablo Rodriguez wrote:
On 06/22/2014 10:40 AM, Mojca Miklavec wrote:
[...] And if I put a complete distribution into the repository as intended, we could probably set the time limit to a few days ;)
GitHub is then fine.
No, it's not. They'll kick us out once we start putting gigabytes of binaries on their site.
I understand their position. Huge binaries on GitHub would make them something different than a git hosting provider.
BTW, what is the problem with distributing the huge binaries on CTAN?
None (other than CTAN mirrors generating more traffic if they had to sync *too much*). Except that it wouldn't really help us to have anything on the CTAN mirrors if they aren't set up to serve as rsync and git servers.
Sorry, I overlooked that this was ConTeXt.
You couldn't have known it. I put the repository there after you asked, and there was no description there earlier.
I thought I hadn’t read a description, but I tend not to trust my memory too much ;-).
If you allow me a suggestion for GitHub I’d remove the issue tracker on both repositories.
Thank you for the suggestion, done.

I already had to explain to people that I'm unable to accept pull requests (even though that was for a different mirror). I would have done it earlier, but I forgot about it.

Mojca
participants (3)

- luigi scarso
- Mojca Miklavec
- Pablo Rodriguez