I really don’t get why I should use anything other than dd
One of the goals of the new GNOME project handbook is to provide effective guidelines for contributors. Most of the guidelines are based on recommendations that GNOME already had, which were then improved and updated. These improvements were based on input from others in the project, as well as by drawing on recommendations from elsewhere.
The best example of this effort was around issue management. Before the handbook, GNOME’s issue management guidelines were seriously out of date, and were incomplete in a number of areas. Now we have shiny new issue management guidelines which are full of good advice and wisdom!
The state of our issue trackers matters. An issue tracker with thousands of open issues is intimidating to a new contributor. Likewise, lots of issues without a clear status or resolution makes it difficult for potential contributors to know what to do. My hope is that, with effective issue management guidelines, GNOME can improve the overall state of its issue trackers.
So what magic sauce does the handbook recommend to turn an out of control and burdensome issue tracker into a source of calm and delight, I hear you ask? The formula is fairly simple:
- Review all incoming issues, and regularly conduct reviews of old issues, in order to weed out reports which are ambiguous, obsolete, duplicates, and so on
- Close issues which haven’t seen activity in over a year (see the sketch after this list)
- Apply the “needs design” and “needs info” labels as needed
- Close issues that have been labelled “needs info” for 6 weeks
- Close issues labelled “needs design” after 1 year of inactivity, like any other issue
- Recruit contributors to help with issue management
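To make the stale-issue rules above concrete, here is a minimal sketch of how they could be automated against a GitLab project using the python-gitlab library. The instance URL, project path, token handling and thresholds are illustrative assumptions rather than anything the handbook prescribes, and a real module would probably want a dry-run mode and human review on top.

```python
# Minimal triage sketch, assuming python-gitlab is installed and GITLAB_TOKEN
# grants API access. The instance URL and project path are placeholders.
import os
from datetime import datetime, timedelta, timezone

import gitlab

STALE_AFTER = timedelta(days=365)       # close after one year of inactivity
NEEDS_INFO_AFTER = timedelta(weeks=6)   # close "needs info" after six weeks

gl = gitlab.Gitlab("https://gitlab.example.org", private_token=os.environ["GITLAB_TOKEN"])
project = gl.projects.get("some-group/some-module")  # hypothetical project path

now = datetime.now(timezone.utc)
for issue in project.issues.list(state="opened", iterator=True):
    last_update = datetime.fromisoformat(issue.updated_at.replace("Z", "+00:00"))
    idle = now - last_update

    if "needs info" in issue.labels and idle > NEEDS_INFO_AFTER:
        reason = "no response to the information request for six weeks"
    elif idle > STALE_AFTER:
        reason = "no activity for over a year"
    else:
        continue

    # Leave a polite note explaining the closure, then close the issue.
    issue.notes.create({"body": f"Closing: {reason}. Please reopen if this is still relevant."})
    issue.state_event = "close"
    issue.save()
```

The point of the sketch is simply that the rules are mechanical enough to be scripted; whether a module wants that level of automation, or prefers to apply them by hand during regular triage sessions, is a separate question.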
To some readers this is probably controversial advice, and likely conflicts with their existing practice. However, there’s nothing new about these issue management procedures. The current incarnation has been in place since 2009, and some aspects of them are even older. Also, personally speaking, I’m of the view that effective issue management requires taking a strong line (being strong doesn’t mean being impolite, I should add – quite the opposite). From a project perspective, it is more important to keep the issue tracker focused than it is to maintain a database of every single tiny flaw in its software.
The guidelines definitely need some more work. There will undoubtedly be some cases where an issue needs to be kept open despite it being untouched for a year, for example, and we should figure out how to reflect that in the guidelines. I also feel that the existing guidelines could be simplified, to make them easier to read and consume.
I’d be really interested to hear what changes people think are necessary. It is important for the guidelines to be something that maintainers feel that they can realistically implement. The guidelines are not set in stone.
That said, it would also be awesome if more maintainers were to put the current issue management guidelines into practice in their modules. I do think that they represent a good way to get control of an issue tracker, and this could be a really powerful way for us to make GNOME more approachable to new contributors.
Dear Tumbleweed users and hackers,
Last week, there was a public holiday on Thursday in some parts of the world (Ascension Day). Unsurprisingly, many devs, including myself and Ana, took Friday off to enjoy a longer weekend (and I can tell you: the weather was fantastic). As a result, I have to cover two weeks of changes to Tumbleweed here once again. We have published 12 snapshots since my last review (0502…0515; snapshots 0504 and 0513 were not built due to the weekends).
The most relevant changes delivered as part of those snapshots were:
- Mozilla Firefox 125.0.3
- LibreOffice 24.2.3.2
- GNOME 46.1
- GIMP 2.10.38
- LLVM 18.1.5
- GCC 14.1
- KDE Frameworks 6.2.0
- PHP 8.3.7
- PostgreSQL 16.3
- Systemd 255.5 & 255.6
- Linux kernel 6.8.9 (with linux-glibc-devel already prepared at 6.9)
- Ruby 3.3.1
- QEMU 8.2.3
- util-linux 2.40.1
Snapshot 0515 contained an openssh update that mistakenly recommended installation of the subpackage openssh-server-config-rootlogin. This package has existed since the default configuration of OpenSSH was changed to no longer permit root login, so that admins could easily switch it back on; due to an error, it was now pulled in for automatic installation. This has since been corrected, and a version of openssh-server that does NOT recommend the subpackage has been published to the update channel. Please check your installation and remove the package again, should it be installed and you don’t need it (we can’t auto-remove it without breaking users who explicitly wanted it).
The following things are known to be worked on at the moment and are reaching you in some upcoming snapshot:
- chkstat package being renamed to permctl
- Rust 1.78
- Mesa 24.0.7
- Linux kernel 6.9.1
- Ninja 1.12
- dbus-broker: some networking issue after upgrades left to work out
- GCC 14: phase 2: use gcc14 as the default compiler – lots of help needed: https://build.opensuse.org/project/show/openSUSE:Factory:Staging:Gcc7
What’s happened?
The Linux kernel project has become its own CVE Numbering Authority (CNA) with two very notable features:
- CVE identifiers will only be assigned after a fix is already available and in a release; and
- the project will err on the side of caution, and assign CVEs to all fixes.
This means each new kernel release will contain a lot of CVE fixes.
So what?
This could contribute to a significant change in behaviour for commercial software vendors.
The kernel project has long advocated updating to the latest stable release in order to benefit from fixes, including security patches. They’re not the only ones: Google has analysed this topic and Codethink talks extensively about creating software with Long Term Maintainability baked in.
But alas, a general shift to this mentality appears to elude us: the prevalent attitude amongst the majority of commercial software vendors is still very much “ship and forget”.
Consider the typical pattern: SoC vendors base their BSP on an old and stable Linux distribution. Bespoke development occurs on top of this, and some time later, a product is released to market. By this point, the Linux version is out of date, quite likely unsupported and almost certainly vulnerable from a security perspective.
Now, fair enough, upgrading your kernel is non-trivial: it needs to be carefully thought through, requires extensive testing, and often calls for careful planning to ensure collaboration between different parties, especially if you have dependencies on vendor blobs or other proprietary components. Clearly, this kind of thing needs to be thought about from day one of a new project. Sadly, in practice, upgrading often simply isn’t even planned for.
And now?
With the Linux kernel project becoming a CNA, every new kernel release will now highlight just how far behind mainline these products are and, by implication, how exposed the software is to security vulnerabilities.
The result should be increased pressure on vendors to upgrade.
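As a rough illustration of that pressure, here is a small sketch that compares a product’s shipped kernel version against the latest stable release announced on kernel.org. It assumes kernel.org’s releases.json feed (which publishes the current release list) and a hard-coded product version, both purely for the sake of the example; a real check would also need to account for vendor backports.

```python
# Rough sketch: how far behind the latest stable kernel is a shipped product?
# Assumes kernel.org's releases.json feed exposes a "latest_stable" entry;
# the product kernel version below is a made-up example.
import json
from urllib.request import urlopen

def parse(version):
    """Turn a version string like '6.1.55' into a comparable tuple (6, 1, 55)."""
    return tuple(int(part) for part in version.split(".") if part.isdigit())

with urlopen("https://www.kernel.org/releases.json") as response:
    latest_stable = json.load(response)["latest_stable"]["version"]

product_kernel = "5.10.120"  # e.g. a BSP kernel frozen at product launch

if parse(product_kernel) < parse(latest_stable):
    print(f"Product kernel {product_kernel} is behind latest stable {latest_stable}: "
          "every new release now carries its own batch of CVE fixes that are missing here.")
else:
    print(f"Product kernel {product_kernel} is up to date with stable ({latest_stable}).")
```

The comparison is deliberately simplistic, but that is rather the point: once every release comes with CVE identifiers attached, the gap between a product’s kernel and mainline becomes easy to measure and hard to ignore.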
With this, plus the recent surge in regulations around keeping software up to date (see the CRA, UNECE R155 and R156), we may start to see a genuine movement towards software being designed to be properly maintained and updated, i.e. “ship and remember” or Long Term Maintainability. Let’s hope so.
What else?
Well, the Linux kernel is just one project. There are countless other FOSS projects which are depended on by almost all commercial projects, and they may also be interested in becoming their own CNA.
This would further increase the visibility of the problem, and apply a renewed focus on the criticality of releasing software products with plans to upgrade built in from the start.
If you would like to learn more about CNAs or Codethink’s Long Term Maintainability approach, reach out via sales@codethink.co.uk.
If I understood you correctly, ShareDrop should fix your problem; there you can “add” someone from a different network via QR code
Linux Distribution (Distro) and Desktop Environment (DE). Not sure why the commenter above expected you to use Linux though
Sooo just like any regular graveyard?
Okay, MKVToolNix isn’t the right tool for that afaik, but ffmpeg is. It sucks that there isn’t a GUI for it as powerful as the CLI (which is what I use), but luckily you can find a lot of help online.
Apart from that I’m afraid I can’t help much, but good luck with your search!
Are you trying to concat streams or just to remux? For remuxing, as roawre said, there’s also MKVToolNix, which works great (and it has a GUI, don’t worry)
ffmpeg; if you need a GUI, use HandBrake
A Br*tish woman