I've been messing around with GitLab as a self-hosted alternative for a few years. I do like it, but it is resource intensive!
For the past few days I've been playing with Forgejo (from the Codeberg people). It is fantastic.
The biggest difference is memory usage. GitLab is Ruby on Rails plus over a dozen services (GitLab itself, then nginx, Postgres, Prometheus, etc.). Forgejo is written in Go and is a single binary.
I have been running GitLab for several years (for my own personal use only!) and it slowly but reliably eats up the entirety of the RAM on a 16 GB VM. I have only been playing with Forgejo for a few days, but it is using only 300 MB of the 8 GB of RAM I allocated, and that machine is running both the server and a runner (idle, but still...).
I'm really excited about Forgejo and dumping GitLab. The biggest difference I can see is that Forgejo does not have GraphQL support, but the REST API seems, at first glance, to be fine.
EDIT: I don't really understand the difference between Gitea and Forgejo. Can anyone explain? I see lots of directories inside the Forgejo volume when I run it using podman that clearly indicate they are the same under the hood in many ways.
What exactly is the advantage of running something like GitLab vs what I do, which is just a server with SSH and a file system? To create a new repo I do:
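(The snippet appears to have been lost; presumably it's something along these lines, with example.com and the paths as stand-ins:)

```shell
# On the server (over ssh in practice): create a bare repository,
# which is all that "hosting" a git remote requires.
mkdir -p repos
git init --bare repos/my-proj.git
# Back on the workstation:
#   git remote add origin example.com:repos/my-proj.git
#   git push -u origin main
```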
Then I just set my remote origin URL to example.com:repos/my-proj.git
The filesystem on example.com is backed up daily. Since I do not need to send myself pull requests for personal projects, and I track my own TODOs and issues via TODO.md, what exactly am I missing? I have been using GitHub for open source projects and work for years, but for projects where I am the only author, why would I need a UI besides git and my code editor of choice?
Collaboration, and specifically collaboration with non-git-nerds. That's primarily what made GitHub win the VCS wars back in the day: the pull request model appealed to anyone who didn't want to learn to craft and email patches.
Our product studio, currently around 50 users who need daily git access, moved to a self-hosted Forgejo nearly 2 years ago.
I really can't overstate the positive effects of this transition. Forgejo is a really straightforward Go service with a very manageable mental model for storage and config. It's been easy and cheap to host and maintain, our team has contributed multiple bugfixes and improvements, and we've built a lot of internal tooling around Forgejo which otherwise would've required a much more elaborate (and slow) integration with GitHub.
Our main instance is hosted on premise, so even in the extremely rare event of our internet connection going offline, our development and CI workflows remain unaffected (Forgejo is also a registry/store for most package managers so we also cache our dependencies and docker images).
Just run `podman login` or `docker login` against your.forgejo.instance.address, then push to it as normal. The target repo must already exist. You can check the images under Site Administration -> Packages.
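Roughly like this (the instance address, user, and image names are all placeholders; the podman commands are shown commented since they need a live instance):

```shell
# The image reference embeds the Forgejo instance and the owning
# user/org, just like any OCI registry:
registry="your.forgejo.instance.address"
image="$registry/myuser/my-proj:latest"
echo "$image"
# Then:
#   podman login "$registry"
#   podman tag localbuild:latest "$image"
#   podman push "$image"
```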
Speaking of authentication, it also works as an OpenID provider, meaning every other piece of web software that supports OpenID can authenticate against Forgejo... which in turn can look up users in other sources.
It also has wikis.
It's an underrated piece of software that uses a ridiculously small amount of computer resources.
That's so brilliant. Wow. I'm struggling to wrap my brain around how they not only support OCI (docker) but also APK (alpine) and APT (debian) packages. That's a very cool feature.
Ease of maintenance is an even bigger difference. We've been using gitea for a bit over five years now, and gitlab for a few years before that, and gitea requires no maintenance in comparison. Upgrades come down to pulling the new version and restarting the daemon, and take just a few seconds. It's definitely the best solution for self-hosters who want to spend as little time as possible on their infrastructure.
Backups are handled by zfs snapshots (like every other server).
We've also had at least 10× lower downtime compared to github over the same period of time, and whatever downtime we had was planned and always in the middle of the night. Always funny reading claims here that github has much better uptime than anything self-hosted from people who don't know any better. I usually don't even bother responding anymore.
I guess I'll just chime in that while GitLab is a very heavy beast, I have self-hosted it for over a decade with little to no issues. It's pretty much as simple as installing their Omnibus package repository and running apt install gitlab-ce.
I found Gitea's interface to be so unusably bad that I switched to full-fat GitLab.
Gitea refused to do some perfectly sensible action (I think it had something to do with creating a fork of my own repo). Looking online, there's zero technical reason for this, and the explanation given was "this is how GitHub does things". Immediately uninstalled. I'm not here for this level of disrespect.
When I self-hosted GitLab I never found the maintenance to be that bad: just change a version in a compose.yml, sometimes jumping between blessed versions if I'd missed a few back to back.
Like others, I've switched to Gitea, but whenever I do visit GitLab I can't help but think the design/UX is so much nicer.
It's a shame that GitHub won the CI race by sheer force of popularity and it propagates its questionable design decisions. I wish more VCS platforms would base their CI systems on Gitlab, which is much much better than GitHub actions.
One concern the post brings up: single point of failure. Yes, in this case, blah blah big company Microsoft blah blah (I don't disagree, but...). I'm more worried about places like PayPal/Google/etc. banning accounts than about the beast from Redmond.
Self-hosting is still a single point of failure, and as for the article's "mirroring" argument, well... it gives you redundancy for reads, but what about writes?
I think it's a fair concern. E.g. Forgejo is a simple directory on disk, with an option to use S3 storage instead. It really is a no-brainer to set that up for as much resilience as necessary, with various degrees of "advanced" depending on your threat model and experience. The lack of a FAANG/M in the equation makes it even more palatable.
If you want even more minimal, Gerrit is structured as a Java app with no external dependencies like databases; it stores all its configuration and runtime information on the filesystem, mostly as data structures in the git repos.
A shared filesystem is all you need to scale/replicate it, and it also makes the backup process quite simple.
I might be one of the few who's intrigued by this even though it's Java, but it looks really neat. Does it host git repositories like Gitea, GitHub, etc., or is it more of a project management site for the repositories? They describe it as "code review", so I wasn't sure.
I'm a little put off by the Google connection, but it seems like it could run rather independently.
We've been looking at Forgejo too. Do you have any experience with Forgejo Actions you can share? That is one thing we are looking at with a little trepidation.
I set up actions yesterday. There are a few tiny rough edges, but it is definitely working for me. I'm using it to build my Hugo blog, which sprinkles in a Svelte app, so it needs nodejs + Hugo and a custom orchestrator written in Zig.
What I did:
* used a custom docker image on my own registry domain with hugo/nodejs and my custom zig app
* no problems
* store artifacts
* required using a different artifact version: "uses: actions/upload-artifact@v3" instead of v4
* An example of how there are some subtle differences from GitHub Actions; but IMHO this is still a step forward, because GitLab CI YAML is totally different
* can't browse the artifacts like I can on GitLab, it only allows download of the zip. Not a big deal, but it'd be nice to verify without littering my Downloads folder.
* Unable to use "forgejo-runner exec" which I use extensively to test whether a workflow is correct before pushing
* Strange error: "Error: Open(/home/runner/.cache/actcache/bolt.db): timeout"
* I think GitLab broke this feature recently as well!
* Getting the runner to work with podman and as a service was a little tricky (but now works)
* Mostly because the docker socket is not created by default with podman
* And the docker_host path is different inside the runner config file.
* There are two config files: one (JSON) is always stored in .runner and contains the auth information and IP; the other is YAML, is passed to the runner with the -c switch, and holds the runner's configuration (docker options, etc). It's a bit strange there are two files IMHO.
This will occur if you have a `forgejo-runner daemon` running while you try to use `exec` -- both try to open the cache database, and only the first one to open it can operate. You can avoid this by pointing the daemon at a different cache directory via `cache.dir` in the config file, or by running the two processes as different users.
> It's a bit strange there are two files IMHO.
The `.runner` file isn't a config file, it's a state file -- not intended for user editing. But yes, it's a bit odd.
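For anyone setting this up, a rough sketch of the YAML side (keys from my own setup; generate a full template with `forgejo-runner generate-config` and treat the exact fields as version-dependent):

```yaml
# config.yml -- the YAML config, passed explicitly:
#   forgejo-runner -c config.yml daemon
runner:
  file: .runner                    # where the JSON state file lives
cache:
  dir: /var/cache/forgejo-runner   # give daemon and `exec` separate dirs
container:
  # podman doesn't create the docker socket by default;
  # point docker_host at podman's socket instead
  docker_host: unix:///run/user/1000/podman/podman.sock
```

The `.runner` JSON (token, instance address, runner id) is written at registration time and, as noted above, isn't meant to be edited.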
We use them in our shop. It's quite straightforward if you're already familiar with GitHub Actions. The Forgejo runner is tiny and you can build it even on unsupported platforms (https://code.forgejo.org/forgejo/runner), e.g. we've set up our CI to also run on Macs (by https://www.oakhost.net) for App Store related builds. It's really quite a joy :)
> GitHub has been useful to store all repositories of the Dillo project, as well as to run the CI workflows for platforms in which I don't have a machine available (like Windows, Mac OS or some BSDs).
The post does not mention CI anywhere else; are they doing anything with it, keeping it on GitHub, or getting rid of it?
> Furthermore, the web frontend doesn't require JS, so I can use it from Dillo (I modified cgit CSS slightly to work well on Dillo).
That sounds like a bad approach to developing a Web browser; surely it would be better to make Dillo correctly render the default cgit CSS (which is used by countless projects)?
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
> Additionally, GitHub seems to encourage a "push model" in which you are notified when a new event occurs in your project(s), but I don't want to work with that model. Instead, I prefer it to work as a "pull model", so I only get updates when I specifically look for them. This model would also allow me to easily work offline. Unfortunately, I see that the same push model has been copied to alternative forges.
Someone kind enough to explain this to me? What's the difference between push model and pull model? What about push model makes it difficult to work offline?
I would love to see more projects use git-bug, which works very well for offline collaboration. All bug tracker info is stored in the repo itself. https://github.com/git-bug/git-bug
It still needs work to match the capabilities of most source forges, but for small closed teams it already works very well.
Reminder that POP and IMAP are protocols, and nothing stops a code forge (or any other website) from exposing its internal messaging/notification system to users as a service on the standard IMAP ports. No one is ever required to set up a bridge/relay that sends outgoing messages to, say, the user's Fastmail/Runbox/Proton/whatever inbox. You can just let the user point their IMAP client at _your_ servers, authenticate with their username and password, and fetch the contents of notifications that way. You don't have to implement the server-to-server federation typically associated with email (for incoming messages), and you don't have to worry about deliverability for outgoing mail.
All of this makes sense. Thank you for explaining. I don't think I understand the difference though.
Like are they calling the "GitHub pull request" workflow as the push model? What is "push" about it though? I can download all the pull request patches to my local and work offline, can't I?
A GitHub pull request pushes a notification/e-mail at you to handle the merge, and you have to handle the pull request mostly online.
I don't know how you'd download the pull request as a set of patches and work offline; instead you have to open a branch, merge the PR into that branch, test things, and merge that branch into the relevant one.
Or you have to clone the forked repository, run your tests to see whether the change is relevant/stable and whatnot, and if it works, you can then merge the PR.
---
edit: Looks like you can get the PR as a patch or diff, and it's trivial, but you have to be online again to get it that way. So getting your mail from your box is not enough: you have to fetch every PR as a diff, with a tool or manually, and then organize them. E-mails are a much more unified and simple way to handle all this.
---
In either case, reviewing the changes is not possible when you're offline, plus the pings of the PRs are distracting if your project is popular.
Seems like you found it, but for others: one of the easiest ways to get a PR's diff/patch is to just put .diff or .patch at the end of its URL. I use this all the time!
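Concretely (owner/repo/number are placeholder values):

```shell
# A PR's raw patch lives at the PR URL plus a suffix:
pr_url="https://github.com/owner/repo/pull/123"
patch_url="${pr_url}.patch"      # or "${pr_url}.diff"
echo "$patch_url"
# Grab it once while online, then review/apply offline:
#   curl -sL "$patch_url" -o 123.patch
#   git am 123.patch
```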
It's bonkers to me that there isn't a link to the plain patch from the page. Yes, it's trivial to add a suffix once you know, but lots of people don't, as evidenced by this thread.
Discoverability in UX seems to have completely died.
You could set up a script that lives in the cloud (so you don't have to), receives PRs through webhooks, fetches any associated diff, and stores them in S3 for you to download later.
Maybe another script to download them all at once, and apply each diff to its own branch automatically.
Almost everything about git and github/gitlab/etc. can be scripted. You don't have to do anything on their website if you're willing to pipe some text around the old way.
I would say it is time/life management: push tells you to do something now. In pull I check each Friday afternoon what's up in my hobby project and work on it for a few hours and then call it a day and be uninterrupted till next week.
For real. I've been hearing the interface is slow and requires Javascript for years and never really paid much mind, it worked for me. But lately the page loading has gotten abusively slow. I don't think it can be simply blamed on React because that move was made long before this started.
I've taken to loading projects in github.dev for navigating repos so I pay the js tax just once and it's fine for code reading. But navigating PRs and actions is terrible.
Another social issue on GitHub: you cannot use the "good first issue" tag on a public repository without being subjected to low quality drive-by PRs or AI slop automatically submitted by someone's bot.
I think the issue with centralization is still understated. I know developers who seem to struggle reading code if it's not presented by VS Code or a GitHub page. And then, why not totally capture everyone into developing just with GitHub Codespaces?
This is exactly what well-intentioned folk like to see: it's solving everyone's problems! Batteries included, nothing else is needed! Why use your own machine or software that doesn't ping into a telemetry hell-hole of data collection on a regular basis?
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
I love this. I used to be a big fan of Linear (because the alternatives were dog water), but this also opened the question "why even have a separate, disconnected tool?"
Most of my personal projects have a TODO.md somewhere with a list of things I need to work on. If people really need a frontend for bugs, it wouldn't be more than just rendering that markdown on the web.
Well, if your bugs can be specified clearly in plain text and plain text only, then yeah, I'd also advocate for this approach. Unfortunately, that's not really the case in any bigger software project. I need screenshots, video recordings that are 100 megs, cross-issue linking etc. I hate JIRA (of course) but it gets it right.
We are in the diaspora phase; there is a steady stream of these announcements, each with a different GitHub alternative. I speculate that within a few months, the communities will have settled on a single dominant one. I'm curious whether it will be one of the existing ones, or something new. Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
This has been going on for a decade; at the beginning it was projects moving to GitLab, and now there are a lot of alternative projects, but GitHub is still the only one that counts for discoverability. It's a very small minority of projects that move away from GitHub, and it's way too early to declare GitHub doomed.
No different than everyone talking about the next “iPhone Killer” when someone other than Apple releases a phone. Although, I think that rhetoric has largely died down.
Different devs have different preferred ways to work and collaborate. I doubt the FOSS community will converge on a single solution. I think we’re at a point of re-decentralization, where devs will move their projects to the forge that satisfies their personal/group requirements for control, hosting jurisdiction, corporate vs community ownership, workflow, and uptime.
This is due to increasing competition in the source forge space. It’s good that different niches can be served by their preferred choice, even if it will be less convenient for devs who want to contribute a patch on a more obscure platform.
> I speculate that within a few months, the communities will have settled on a single dominant one.
The solutions on the roadmap are not as centralized as GitHub. There is a real initiative to promote federation, so we would not need to rely on one entity.
I love this, and hope it works out this way. Maybe another way to frame it: In 2 years, what will the "Learn Python for Beginners" tutorials direct the user towards? Maybe there will not be a consensus, but my pattern-matching brain finds one!
> Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
Hah, exactly what we’re attempting with Tangled! Some big announcements to come fairly soon. We’re positioning ourselves to be the next social collab platform—focused solely on indies & communities.
It looks like all they're doing is griping over frontends and interfaces that handle the custodial work other than version control (i.e., everything beyond git's baked-in provisions).
GitLab is too heavyweight for many projects. It’s great for corporations or big organizations like GNOME, but it’s slow and difficult to administer. It has an important place in the ecosystem, but I doubt many small projects will choose it over simpler alternatives like Codeberg.
GitLab is part of the reason I'm thinking along these lines: it has been around for a while as a known, reasonably popular alternative to GitHub. So I expected the announcement to be "We moved to GitLab". Yet what I observe is "We moved to CodeHouse" or "We moved to Source-Base". The self-hosting here, with mirrors to two forges I'm not familiar with, is another direction.
>frontend barely works without JavaScript, ... In the past, it used to gracefully degrade without enforcing JavaScript, but now it doesn't.
And the GitHub frontend developers are aware of these accessibility problems (via the forums and bug reports). They just don't care anymore. They just want to make the site appear to work at first glance, which is why index pages are actual text in HTML but nothing else is.
I'd love to hear the inside story of GitHub's migration of their core product features to React.
It clearly represents a pretty seismic cultural change within the company. GitHub was my go-to example of a sophisticated application that loaded fast and didn't require JavaScript for well over a decade.
The new React stuff is sluggish even on a crazy fast computer.
My guess is that the "old guard" who made the original technical decisions all left, and since it's been almost impossible since ~2020 to hire a frontend engineer who isn't a JavaScript/React-first developer, the weight of industry fashion became too much to resist.
But maybe I'm wrong and they made a technical decision to go all-in on heavy JavaScript features that was reasoned out by GitHub veterans and accompanied by rock solid technical justification.
GitHub have been very transparent about their internal technical decisions in the past. I'd love to see them write about this transition.
> But beyond accessibility and availability, there is also a growing expectation of GitHub being more app-like.
> The first case of this was when we rebuilt GitHub projects. Customers were asking for features well beyond our existing feature set. More broadly, we are seeing other companies in our space innovate with more app-like experiences.
> Which has led us to adopting React. While we don’t have plans to rewrite GitHub in React, we are building most new experiences in React, especially when they are app-like.
> We made this decision a couple of years ago, and since then we’ve added about 250 React routes that serve about half of the average pages used by a given user in a week.
It then goes on to talk about how mobile is the new baseline and GitHub needed to build interfaces that felt more like mobile apps.
(Personally I think JavaScript-heavy React code is a disaster on mobile since it's so slow to load on the median (Android) device. I guess GitHub's core audience are more likely to have powerful phones?)
For contrast, gitea/forgejo use as little JavaScript as possible, and have been busy removing frontend libraries over the past year or so. For example, jquery was removed in favor of native ES6+.
Let them choke on their "app-like experience", and if you can afford it, switch over to either one. I cannot recommend it enough after using it "in production" daily for more than five years.
I honestly believe that the people involved likely already wanted to move over to React/SPAs for one reason or another, and were mostly just searching for excuses to do so - hence these kind of vague and seemingly disproportional reasons. Mobile over desktop? Whatever app-like means over performance?
Non-technical incentives steering technical decisions is more common than we'd perhaps like to admit.
It's one step forward, two steps back with this "server side rendering" framing of the issue, at least observing Microsoft GitHub's behavior in practice. They'll temporarily enable text on some pages of the site in response to accessibility issues, then a few months later remove it from that type of page and even more besides. As that thread and others I've participated in show, this is a losing battle: Microsoft GitHub will be a JavaScript-only application in the end. People should consider moving their personal projects accordingly. For work, well, one often has to do very distasteful and unethical things for money. And GitHub is where the money is.
GitHub's frontend is mostly still their own [1] Web Components based library. They use Turbo to do client-side reloading.
They have small islands of React based views like Projects view or reworked Pull Request review.
The thing is, even if you disable JavaScript, the site still loads slowly. Try it yourself. Frontend code doesn't seem to be the bottleneck.
What would be nice is an aggregator site one could submit to, with everyone hosting their own projects on their own internet connection, so nobody depends on a single provider for hosting. Maybe something like Bluesky with the AT protocol, but with git repositories.
I hope you will continue maintaining a mirror on GH. Some tools like DeepWiki are excellent resources to learn about a codebase when there is not much documentation going around. But these tools only support pulling from GH.
A neat thing about GitHub is that every file on it can be accessed from URLs like https://raw.githubusercontent.com/simonw/llm-prices/refs/hea... which are served through a CDN with open CORS headers - which means any JavaScript application running anywhere can access them.
It's less about pulling and more about tools like DeepWiki making the assumption that its inputs live in GitHub, so repository URLs are expected to be GH URLs as opposed to a URL to a git repository anywhere.
That being said, there's no reason for tools like it to have those constraints other than pushing users into an ecosystem they prefer (i.e. GitHub instead of other forges).
EDIT 2: Looks like Forgejo is a soft fork created in 2022, when some weird things happened with the governance of the Gitea project: https://forgejo.org/compare-to-gitea/#why-was-forgejo-create...
What exactly is the advantage of running something like a restaurant vs what I do at home which is just cook it myself?
-> convenience, collaboration, mobility
> why would I need a UI besides git and my code editor of choice?
If you ever find yourself wishing for a web UI as well, there's cgit[1]. It's what kernel.org uses[2].
[1]: https://git.zx2c4.com/cgit/ [2]: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...
Wait, forgejo offers a built-in container registry? How does that work? I don't see that in the admin section at all.
Container registry and a lot more; they call it the Package registry in the docs: https://forgejo.org/docs/latest/user/packages/
https://forgejo.org/docs/latest/user/actions/basic-concepts/
It's an interesting take on a purist problem.
I think it’s a fair concern, e.g. forgejo is a simple directory on disk, with an option to make that into an S3 storage. It really is a no brainer to set that up for as much resilience as necessary with various degrees of “advanced” depending on your thread model and experience. The lack of a FAANG/M in the equation makes it even more palatable.
If you want even more minimal, Gerrit is structured as a Java app with no external dependencies like databases, and stores all its configuration and runtime information on the filesystem, mostly as data structures in the git repos.
A shared filesystem is all you need to scale/replicate it, and it also makes the backup process quite simple.
I might be one of the few that is intrigued by this being that it’s Java but this looks really neat. Does it do git repositories like gitea, GitHub, etc, or is it more of a project management site for the repositories? They describe it as “code review”, so I wasn’t sure.
I'm a little put off by the Google connection, but it seems like it could run rather independently.
The deployment may be simple, but at the same time, the Gerrit code review workflow is terrible.
We've been looking at Forgejo too. Do you have any experience with Forgejo Actions you can share? That is one thing we are looking at with a little trepidation.
I set up Actions yesterday. There are a few tiny rough edges, but it is definitely working for me. I'm using it to build my Hugo blog, which "sprinkles" in a Svelte app, so it needs nodejs + Hugo and a custom orchestrator written in Zig.
What I did:
> * Strange error: "Error: Open(/home/runner/.cache/actcache/bolt.db): timeout"
This will occur if you have a `forgejo-runner daemon` running while you try to use `exec` -- both try to open the cache database, and only the first to open it can operate. You can avoid this by giving the daemon its own cache directory via `cache.dir` in the config file, or by running the two processes as different users.
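If you go the config-file route, the relevant knob lives in the runner's YAML config (you can generate a template with `forgejo-runner generate-config`); the path below is just an illustrative example, not a default:

```yaml
# Hypothetical excerpt of the daemon's config.yml -- point the daemon at its
# own cache directory so `forgejo-runner exec` can open the default one.
cache:
  enabled: true
  dir: /var/lib/forgejo-runner/daemon-cache   # example path
```

With the daemon using this dedicated directory, `exec` no longer contends for the same bolt.db.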
> It's a bit strange there are two files IMHO.
The `.runner` file isn't a config file, it's a state file -- not intended for user editing. But yes, it's a bit odd.
We use them in our shop. It's quite straightforward if you're already familiar with GitHub Actions. The Forgejo runner is tiny and you can build it even on unsupported platforms (https://code.forgejo.org/forgejo/runner), e.g. we've set up our CI to also run on Macs (by https://www.oakhost.net) for App Store related builds. It's really quite a joy :)
> GitHub has been useful to store all repositories of the Dillo project, as well as to run the CI workflows for platforms in which I don't have a machine available (like Windows, Mac OS or some BSDs).
The post does not mention CI anywhere else, are they doing anything with it, keeping it on GitHub, or getting rid of it?
> Furthermore, the web frontend doesn't require JS, so I can use it from Dillo (I modified cgit CSS slightly to work well on Dillo).
That sounds like a bad approach to developing a Web browser, surely it would be better to make Dillo correctly work with the default cgit CSS (which is used by countless projects)?
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
The hacker spirit alive and well.
For better and for worse.
> Additionally, GitHub seems to encourage a "push model" in which you are notified when a new event occurs in your project(s), but I don't want to work with that model. Instead, I prefer it to work as a "pull model", so I only get updates when I specifically look for them. This model would also allow me to easily work offline. Unfortunately, I see that the same push model has been copied to alternative forges.
Someone kind enough to explain this to me? What's the difference between push model and pull model? What about push model makes it difficult to work offline?
AFAIK, the author wants to work the way SourceHut and the Linux kernel do: via e-mail.
When you work over e-mail, you sync the relevant IMAP mailbox locally, pulling all the proposed patches down with it -- hence the pull model.
Then you can work through the proposed changes offline, apply them to your local copy, and push the merged changes back online.
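For anyone who hasn't seen the mechanics: it's plain git, no forge involved. A minimal local sketch, with throwaway temp directories standing in for the mailbox and the two parties:

```shell
set -e
work=$(mktemp -d)

# "Upstream" maintainer repo with one commit.
git init -q "$work/upstream"
git -C "$work/upstream" -c user.email=m@example.com -c user.name=M \
    commit -q --allow-empty -m "initial"

# Contributor clones, makes a change, and exports it as an mbox-style patch.
git clone -q "$work/upstream" "$work/contrib"
echo "fix" > "$work/contrib/fix.txt"
git -C "$work/contrib" add fix.txt
git -C "$work/contrib" -c user.email=c@example.com -c user.name=C \
    commit -q -m "Add fix"
git -C "$work/contrib" format-patch -1 -o "$work/outbox" >/dev/null

# Maintainer applies the "e-mailed" patch entirely offline.
git -C "$work/upstream" -c user.email=m@example.com -c user.name=M \
    am "$work/outbox"/0001-*.patch
git -C "$work/upstream" log --oneline
```

The `format-patch` output is exactly what `git send-email` would put on the wire, and `git am` is what the maintainer runs on the receiving side; everything between export and apply can happen without a network connection.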
I would love to see more projects use git-bug, which works very well for offline collaboration. All bug tracker info is stored in the repo itself. https://github.com/git-bug/git-bug
It still needs work to match the capabilities of most source forges, but for small closed teams it already works very well.
Reminder that POP and IMAP are protocols, and nothing stops a code forge—or any other website—from exposing the internal messaging/notification system to users as a service on the standard IMAP ports; no one is ever required to set up a bridge/relay that sends outgoing messages to, say, the user's Fastmail/Runbox/Proton/whatever inbox. You can just let the user point their IMAP client to _your_ servers, authenticate with their username and password, and fetch the contents of notifications that way. You don't have to implement server-to-server federation typically associated with email (for incoming messages), and you don't have to worry about deliverability for outgoing mail.
All of this makes sense. Thank you for explaining. I don't think I understand the difference though.
So are they calling the "GitHub pull request" workflow the push model? What is "push" about it, though? I can download all the pull request patches locally and work offline, can't I?
A GitHub pull request pushes a notification/e-mail at you to handle the merge, and you mostly have to handle the pull request online.
I don't know how you can download the pull request as a set of patches and work offline; you have to open a branch, merge the PR into that branch, test things, and merge that branch into the relevant one.
Or you have to download the forked repository, run your tests to see whether the change is relevant/stable and whatnot, and if it works, you can then merge the PR.
---
edit: Looks like you can get the PR as a patch or diff, and it's trivial, but you have to be online again to get it that way. So getting the mail from your box is not enough: you have to fetch every PR as a diff, with a tool or manually, then organize them. E-mail is a much more unified and simpler way to handle all this.
---
In either case, reviewing the changes is not possible when you're offline, plus the pings from PRs are distracting if your project is popular.
Seems like you found it, but for others: one of the easiest ways to get a PR's diff/patch is to just put .diff or .patch at the end of its URL. I use this all the time!
Random PR example, https://github.com/microsoft/vscode/pull/280106 has a diff at https://github.com/microsoft/vscode/pull/280106.diff
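In script form, the trick is pure string manipulation on the PR URL (the VS Code PR above is used purely as an example):

```shell
pr="https://github.com/microsoft/vscode/pull/280106"

# Appending .patch (or .diff) yields the raw, git-am-able form of the PR.
patch_url="${pr}.patch"
echo "$patch_url"

# To actually fetch and apply it offline later (network needed once):
#   curl -sL "$patch_url" -o pr.patch && git am pr.patch
```

`.patch` gives mbox-formatted commits suitable for `git am`; `.diff` gives a single unified diff suitable for `git apply`.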
Another thing that surprises some is that GitHub's forks are actually just "magic" branches. I.e the commits on a fork exist in the original repo: https://github.com/microsoft/vscode/commit/8fc3d909ad0f90561...
It's bonkers to me that there isn't a link to the plain patch from the page. Yes, it's trivial to add a suffix once you know, but lots of people don't -- as evidenced by this thread.
Discoverability in UX seems to have completely died.
> It's bonkers to me that there isn't a link to the plain patch from the page.
It's yet another brick on the wall of the garden. That's left there for now, but for how long?
IOW, it's deliberate. Plus, GitHub omits trivial features (e.g. deleting projects, an "add review" button, etc.) while porting their UI.
It feels like they don't care anymore.
You could set up a script that lives in the cloud (so you don't have to), receives PRs through webhooks, fetches any associated diff, and stores it in S3 for you to download later.
Maybe another script to download them all at once and apply each diff to its own branch automatically.
Almost everything about git and github/gitlab/etc. can be scripted. You don't have to do anything on their website if you're willing to pipe some text around the old way.
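A rough sketch of the extraction step, with a simulated payload (the real `pull_request` webhook body is much larger, and in practice you'd want a real JSON parser; grep/cut stand in here to keep it dependency-free):

```shell
# Simulated, heavily trimmed pull_request webhook payload.
payload='{"action":"opened","pull_request":{"number":1,"diff_url":"https://github.com/o/r/pull/1.diff"}}'

# Pull out diff_url without external JSON tooling.
diff_url=$(printf '%s' "$payload" | grep -o '"diff_url":"[^"]*"' | cut -d'"' -f4)
echo "$diff_url"

# A cron-able follow-up could then do, e.g.:
#   curl -sL "$diff_url" -o pr-1.diff && git checkout -b pr-1 && git apply pr-1.diff
```

Everything downstream of the webhook -- fetching, branching, applying -- is the same plain-text piping as above.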
Why complicate the workflow when it can be solved with a simple e-mail?
> Almost everything about git and github/gitlab/etc. can be scripted.
Moving away from GitHub is more philosophical than technical at this point. I also left the site the day they took Copilot to production.
I would say it's time/life management: push tells you to do something now. With pull, I check each Friday afternoon what's up in my hobby project, work on it for a few hours, then call it a day and stay uninterrupted until next week.
We’d love to have the Dillo project on Tangled! ;) https://tangled.org
Nice to see it works with no JS
I wish Tangled supported alternatives to Git
If anyone wants to add Forgejo to their VM, I made a script that quickly installs server + runner, so you get the full setup:
https://wkoszek.github.io/easyforgejo/
> On the usability side, the platform has become more and more slow over time
The best reason right here.
For real. I've been hearing the interface is slow and requires Javascript for years and never really paid much mind, it worked for me. But lately the page loading has gotten abusively slow. I don't think it can be simply blamed on React because that move was made long before this started.
I've taken to loading projects in github.dev for navigating repos so I pay the js tax just once and it's fine for code reading. But navigating PRs and actions is terrible.
Seems like a good place to pitch git-appraise: https://github.com/google/git-appraise
I'm not part of the project at all, but this is the only offline code review system I've found.
Excellent. I hope to see more of it.
Another social issue on GitHub: you cannot use the "good first issue" tag on a public repository without being subjected to low quality drive-by PRs or AI slop automatically submitted by someone's bot.
I think the issue with centralization is still understated. I know developers who seem to struggle reading code if it's not presented by VS Code or a GitHub page. And then, why not totally capture everyone into developing just with GitHub Codespaces?
This is exactly what well-intentioned folk like to see: it's solving everyone's problems! Batteries included, nothing else is needed! Why use your own machine or software that doesn't ping into a telemetry hell-hole of data collection on a regular basis?
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
I love this. I used to be a big fan of Linear (because the alternatives were dog water), but this also raised the question: why even have a separate, disconnected tool?
Most of my personal projects have a TODO.md somewhere with a list of things I need to work on. If people really need a frontend for bugs, it wouldn't be more than rendering that markdown on the web.
> As it is simply plain text
Well, if your bugs can be specified clearly in plain text and plain text only, then yeah, I'd also advocate for this approach. Unfortunately, that's not really the case in any bigger software project. I need screenshots, video recordings that are 100 megs, cross-issue linking etc. I hate JIRA (of course) but it gets it right.
Even in the case of Dillo, the migrated bugs from GitHub include ZIP files (that are still hosted on GitHub): https://bug.dillo-browser.org/50/
We are in the diaspora phase; there is a steady stream of these announcements, each with a different GitHub alternative. I speculate that within a few months, the communities will have settled on a single dominant one. I'm curious if it will be one of the existing ones, or something new. Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
This has been going on for a decade; at the beginning it was projects moving to GitLab, now there are a lot of alternative projects, but GitHub is still the only one that counts for discoverability. A very small minority of projects move away from GitHub, and it's way too early to declare it doomed.
No different than everyone talking about the next “iPhone Killer” when someone other than Apple releases a phone. Although, I think that rhetoric has largely died down.
Different devs have different preferred ways to work and collaborate. I doubt the FOSS community will converge on a single solution. I think we’re at a point of re-decentralization, where devs will move their projects to the forge that satisfies their personal/group requirements for control, hosting jurisdiction, corporate vs community ownership, workflow, and uptime.
This is due to increasing competition in the source forge space. It’s good that different niches can be served by their preferred choice, even if it will be less convenient for devs who want to contribute a patch on a more obscure platform.
> I speculate that within a few months, the communities will have settled on a single dominant one.
The solutions on the roadmap are not as centralized as GitHub. There is a real initiative to promote federation, so we would not need to rely on one entity.
I love this, and hope it works out this way. Maybe another way to frame it: In 2 years, what will the "Learn Python for Beginners" tutorials direct the user towards? Maybe there will not be a consensus, but my pattern-matching brain finds one!
The settling on a dominant one won't happen; self-hosting just becomes more popular.
The bigger question is whether we want a single dominant replacement, or whether it just means we'll be back in the same place in 5 years.
> Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
Hah, exactly what we’re attempting with Tangled! Some big announcements to come fairly soon. We’re positioning ourselves to be the next social collab platform—focused solely on indies & communities.
It looks like all they're doing is griping over frontends and interfaces for the custodial work other than version control (i.e., everything beyond git's baked-in provisions).
How would you rate e-mail's candidacy?
Isn't that pretty much GitLab? But then most people still prefer GitHub anyway.
GitLab is too heavyweight for many projects. It’s great for corporations or big organizations like GNOME, but it’s slow and difficult to administer. It has an important place in the ecosystem, but I doubt many small projects will choose it over simpler alternatives like Codeberg.
Gitlab is worse than GitHub in every way.
At least GitHub adds new features over time.
Gitlab has been removing features in favor of more expensive plans even after explicitly saying they wouldn’t do so.
> At least GitHub adds new features over time.
Not as quickly as they add anti-features, imho.
Gitlab works fine for me. Been using it at work for a few years and recently moved all my personal repos there
GitLab is part of the reason I'm thinking along these lines: it has been around for a while as a known, reasonably popular alternative to GitHub. So I expected the announcements to be "We moved to GitLab", yet what I observe is "We moved to CodeHouse" or "We moved to Source-Base". The self-hosting here, with mirrors to forges I'm not familiar with, is another direction.
I think people are wary of moving to GitLab because it's a similarly large platform and they don't want to repeat the same mistakes.
gitlab has also gone full slop
I didn't know about forgejo, it looks pretty nice.
Forgejo is what codeberg runs on, which imo is an awesome alternative to github
>frontend barely works without JavaScript, ... In the past, it used to gracefully degrade without enforcing JavaScript, but now it doesn't.
And the github frontend developers are aware of these accessibility problems (via the forums and bug reports). They just don't care anymore. They just want to make the site appear to work at first glance which is why index pages are actual text in html but nothing else is.
I'd love to hear the inside story of GitHub's migration of their core product features to React.
It clearly represents a pretty seismic cultural change within the company. GitHub was my go-to example of a sophisticated application that loaded fast and didn't require JavaScript for well over a decade.
The new React stuff is sluggish even on a crazy fast computer.
My guess is that the "old guard" who made the original technical decisions all left, and since it's been almost impossible to hire a frontend engineer since ~2020 or so that wasn't a JavaScript/React-first developer the weight of industry fashion became too much to resist.
But maybe I'm wrong and they made a technical decision to go all-in on heavy JavaScript features that was reasoned out by GitHub veterans and accompanied by rock solid technical justification.
GitHub have been very transparent about their internal technical decisions in the past. I'd love to see them write about this transition.
In answer to my own question about in-depth decision making, I just found this presentation from February 2025 by seven-year GitHub veteran Joel Hawksley: https://hawksley.org/2025/02/10/lessons-from-5-years-of-ui-a...
Relevant quote:
> But beyond accessibility and availability, there is also a growing expectation of GitHub being more app-like.
> The first case of this was when we rebuilt GitHub projects. Customers were asking for features well beyond our existing feature set. More broadly, we are seeing other companies in our space innovate with more app-like experiences.
> Which has led us to adopting React. While we don’t have plans to rewrite GitHub in React, we are building most new experiences in React, especially when they are app-like.
> We made this decision a couple of years ago, and since then we’ve added about 250 React routes that serve about half of the average pages used by a given user in a week.
It then goes on to talk about how mobile is the new baseline and GitHub needed to build interfaces that felt more like mobile apps.
(Personally I think JavaScript-heavy React code is a disaster on mobile since it's so slow to load on the median (Android) device. I guess GitHub's core audience are more likely to have powerful phones?)
For contrast, gitea/forgejo use as little JavaScript as possible, and have been busy removing frontend libraries over the past year or so. For example, jquery was removed in favor of native ES6+.
Let them choke on their "app-like experience", and if you can afford it, switch over to either one. I cannot recommend it enough after using it "in production" daily for more than five years.
I honestly believe the people involved likely already wanted to move to React/SPAs for one reason or another, and were mostly searching for excuses to do so -- hence these kinds of vague and seemingly disproportionate reasons. Mobile over desktop? Whatever "app-like" means over performance?
Non-technical incentives steering technical decisions is more common than we'd perhaps like to admit.
github is a tool used where code is written: on desktop computers
no-one cares about the github mobile experience
microsoft making the windows 8 mistake all over again
I interact with GitHub on my mobile phone every day.
yeah and I bet three people used Windows 8 on tablets too.
I think you are wildly underestimating how common it is for people to use GitHub from a phone.
It's where I interact with notifications about new issues and PRs for one thing. I doubt I'm alone there.
Who has ever used github on mobile?
I'd like to see their logs about this.
Me, every day.
https://github.com/orgs/community/discussions/62372#discussi...
It's one step forward, two steps back with this "server side rendering" framing of the issue, given Microsoft GitHub's behavior in practice. They'll temporarily enable text on a type of page in response to accessibility issues, then a few months later remove it from that page type and even more besides. As that thread and others I've participated in show, this is a losing battle. Microsoft GitHub will be a JavaScript-only application in the end. People should consider moving their personal projects accordingly. For work, well, one often has to do distasteful and unethical things for money, and GitHub is where the money is.
Having to enable javascript to see a website is not an accessibility problem according to WCAG.
It is a very real accessibility problem if you're using Dillo, which does not support javascript.
There's 'enabling javascript' and then there's 'requiring a JavaScript VM with bleeding-edge features basically found in only 3 browsers'.
Fixing accessibility problems won't make shareholders happy while forcing AI down our throats will.
To me, this sounds like a good change. And FWIW, I am finding I use Dillo more and more these days.
I went to GitLab from GitHub due to the Microsoft changes; my needs are very simple, and so far GitLab seems OK.
I also mirror just the current source on sdf.org via gopher. If GitLab causes issues, this could very well become my main site.
>The most annoying problem is that the frontend barely works without JavaScript,
Not only did they spend years rewriting the frontend from pjax to (I think) React, they also managed to lose customers because of it.
GitHub's frontend is mostly still their own Web Components based library [1]. They use Turbo for client-side reloading, with small islands of React-based views like the Projects view or the reworked pull request review. The thing is, even if you disable JavaScript, pages still load slooow. Try it yourself. Frontend code doesn't seem to be the bottleneck.
[1] https://github.blog/engineering/architecture-optimization/ho...
A good reason to move away from GitHub is it is from Microsoft (FAMAG; a company who kissed Trump's ring).
Sourcehut is hosted in The Netherlands, and Codeberg in Germany.
What would be nice is an aggregator site one could submit to, with everyone hosting their repos on their own internet connection, so nobody depends on a single source for hosting their projects. Maybe something like Bluesky with the AT Protocol, but with git repositories.
There’s Tangled[0], but I don’t have personal experience with it.
[0]: https://tangled.org/
For just text there's Usenet, Freenet, Mastodon. Though these work for more than merely text.
I suppose something like this with git and source code exists on Tor.
During the Arab Spring and Hong Kong protests, Bluetooth was used to share messages whilst the internet was cut off.
I hope you will continue maintaining a mirror on GH. Some tools like DeepWiki are excellent resources to learn about a codebase when there is not much documentation going around. But these tools only support pulling from GH.
I have the exact opposite experience where I had to block multiple such "excellent resources" from my search results.
How is pulling dependent on github?
Git pulling isn't unique to GitHub, and it works over HTTP or SSH?
A neat thing about GitHub is that every file on it can be accessed from URLs like https://raw.githubusercontent.com/simonw/llm-prices/refs/hea... which are served through a CDN with open CORS headers - which means any JavaScript application running anywhere can access them.
Demo: https://tools.simonwillison.net/cors-fetch?url=https%3A%2F%2...
It's less about pulling and more about tools like DeepWiki making the assumption that its inputs live in GitHub, so repository URLs are expected to be GH URLs as opposed to a URL to a git repository anywhere.
That being said, there's no reason for tools like it to have those constraints other than pushing users into an ecosystem they prefer (i.e. GitHub instead of other forges).