Dimitri John Ledkov: Ubuntu Livepatch service now supports over 60 different kernels
Linux kernel getting a livepatch whilst running a marathon. Generated with AI.
manifest-version: "0.1"
installations:
  - install:
      source: foo
      dest-dir: usr/bin

In its deserialized (JSON-like) form, the same manifest looks like:

{
    "manifest-version": "0.1",
    "installations": [
        {
            "install": {
                "source": "foo",
                "dest-dir": "usr/bin"
            }
        }
    ]
}
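For reference, that deserialized form is exactly what a standard YAML loader produces; a quick check with PyYAML (an illustrative snippet, not debputy code, and the variable names are made up):

import yaml  # PyYAML

manifest_text = '''\
manifest-version: "0.1"
installations:
  - install:
      source: foo
      dest-dir: usr/bin
'''

data = yaml.safe_load(manifest_text)
print(data["installations"][0]["install"]["source"])  # -> foo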
So beyond me being unsatisfied with the current situation, it was also clear to me that I needed to come up with a better solution if I wanted externally provided plugins for debputy. To put a bit more perspective on what I expected from the end result:
- Most plugins would probably throw KeyError: X or ValueError style errors for quite a while (see the sketch after this list). Worst case, they would end up on my table because the user would have a hard time telling where debputy ends and where the plugin starts. "Best" case, I would teach debputy to say "This poor error message was brought to you by plugin foo. Go complain to them". Either way, it would be a bad user experience.
- This even assumes plugin providers would actually bother writing manifest parsing code. If it is that difficult, then just providing a custom file in debian/ might tempt plugin providers, and that would undermine the idea of having the manifest be the sole input for debputy.
- It had to cover as many parsing errors as possible. Any error case this code handled for you would be an error where I could ensure a sufficient degree of detail and context for the user.
- It should be type-safe / provide typing support such that IDEs/mypy could help you when you work on the parsed result.
- It had to support "normalization" of the input, such as
  # User provides
  - install: "foo"
  # Which is normalized into:
  - install:
      source: "foo"
- It must be simple to tell debputy what input you expected.
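To make the failure mode concrete, here is the kind of ad-hoc parsing I wanted plugin providers to avoid; a hypothetical sketch, not code from any actual plugin:

# Hypothetical hand-written parser for the install rule shown earlier.
# It works on happy-path input, but a user typo surfaces as a bare
# KeyError/ValueError with no hint about where in the manifest it happened.
def parse_install_rule(raw):
    install = raw["install"]              # KeyError if the key is missing
    source = install["source"]            # KeyError again
    dest_dir = install.get("dest-dir", "usr/bin")
    if not isinstance(source, str):
        raise ValueError("source must be a string")
    return source, dest_dir

Conceptually, what I wanted instead was to declare the expected input as types and let debputy handle the parsing and error reporting, which leads to the example below.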
# Example input matching this typed dict.
#
#     "sources": ["foo"]
#     "into": ["pkg"]
#
class InstallExamplesTargetFormat(TypedDict):
    # Which source files to install (dest-dir is fixed)
    sources: List[str]
    # Which package(s) that should have these files installed.
    into: NotRequired[List[str]]
# Example input matching this typed dict.
#
#     "source": "foo"
#     "into": "pkg"
#
class InstallExamplesManifestFormat(TypedDict):
    # Note that sources here is split into source (str) vs. sources (List[str])
    sources: NotRequired[List[str]]
    source: NotRequired[str]
    # We allow the user to write `into: foo` in addition to `into: [foo]`
    into: Union[str, List[str]]
FullInstallExamplesManifestFormat = Union[
    InstallExamplesManifestFormat,
    List[str],
    str,
]
def _handler(
    normalized_form: InstallExamplesTargetFormat,
) -> InstallRule:
    ...  # Do something with the normalized form and return an InstallRule.

concept_debputy_api.add_install_rule(
    keyword="install-examples",
    target_form=InstallExamplesTargetFormat,
    manifest_form=FullInstallExamplesManifestFormat,
    handler=_handler,
)
This prototype did not take long (I remember it being within a day) and worked surprisingly well, though with some poor error messages here and there. Under the hood, it would:

- Read TypedDict.__required_keys__ and TypedDict.__optional_keys__ to determine which attributes were present and which were required. This was used for reporting errors when the input did not match.
- Read the types of the provided TypedDict, strip the Required / NotRequired markers, and use basic isinstance checks based on the resulting type for str and List[str]. Again, this was used for reporting errors when the input did not match.

Now came the first challenge: adding the manifest format schema plus the relevant normalization rules. The very first normalization I did was transforming into: Union[str, List[str]] into into: List[str]. At that time, source was not a separate attribute. Instead, sources was a Union[str, List[str]], so it was the only normalization I needed for all my use cases at the time. There are two problems when writing a normalization. The first is determining what the "source" type is, what the target type is, and how they relate. The second is providing a runtime rule for normalizing from the manifest format into the target format. Keeping it simple, the runtime normalizer for Union[str, List[str]] -> List[str] was written as:

def normalize_into_list(x: Union[str, List[str]]) -> List[str]:
    return x if isinstance(x, list) else [x]
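For the introspection half, the standard typing helpers already expose what the bullet points above need; a minimal illustrative sketch (not debputy's actual code, and NotRequired needs Python 3.11 or typing_extensions):

from typing import List, NotRequired, TypedDict, get_args, get_origin

class InstallExamplesTargetFormat(TypedDict):
    sources: List[str]
    into: NotRequired[List[str]]

required = InstallExamplesTargetFormat.__required_keys__   # frozenset({'sources'})
optional = InstallExamplesTargetFormat.__optional_keys__   # frozenset({'into'})

for name, hint in InstallExamplesTargetFormat.__annotations__.items():
    if get_origin(hint) is NotRequired:   # strip the NotRequired marker
        hint = get_args(hint)[0]
    print(name, hint, "required" if name in required else "optional")

The mapping problem then looks like this: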
# Map from
class InstallExamplesManifestFormat(TypedDict):
    # Note that sources here is split into source (str) vs. sources (List[str])
    sources: NotRequired[List[str]]
    source: NotRequired[str]
    # We allow the user to write `into: foo` in addition to `into: [foo]`
    into: Union[str, List[str]]

# ... into
class InstallExamplesTargetFormat(TypedDict):
    # Which source files to install (dest-dir is fixed)
    sources: List[str]
    # Which package(s) that should have these files installed.
    into: NotRequired[List[str]]
While working on all of this type introspection for Python, I had noted the Annotated[X, ...] type. It is basically a fake type that enables you to attach metadata to the type system. A very random example:

# For all intents and purposes, foo is a string despite all the Annotated stuff.
foo: Annotated[str, "hello world"] = "my string here"

That metadata gave me a place to put parse hints, leaving two problems to solve:

- How will the parser generator understand that source should be normalized and then mapped into sources?
- Once that is solved, the parser generator has to understand that while source and sources are declared as NotRequired, they are part of an exactly-one-of rule (since sources in the target form is Required). This mainly came down to extra bookkeeping and an extra layer of validation once the previous step was solved.
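The attached metadata can be recovered at runtime, which is what makes Annotated usable as a carrier for parse hints; a small illustrative sketch (not debputy's actual code):

from typing import Annotated, get_args, get_type_hints

class Example:
    foo: Annotated[str, "hello world"] = "my string here"

hints = get_type_hints(Example, include_extras=True)
base_type, *metadata = get_args(hints["foo"])
print(base_type)   # <class 'str'>
print(metadata)    # ['hello world']

With hints attached this way, the manifest format can describe its own mapping rules: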
# Map from
#
#     "source": "foo"
#     "into": "pkg"
#
class InstallExamplesManifestFormat(TypedDict):
    # Note that sources here is split into source (str) vs. sources (List[str])
    sources: NotRequired[List[str]]
    source: NotRequired[
        Annotated[
            str,
            DebputyParseHint.target_attribute("sources")
        ]
    ]
    # We allow the user to write `into: foo` in addition to `into: [foo]`
    into: Union[str, List[str]]

# ... into
#
#     "sources": ["foo"]
#     "into": ["pkg"]
#
class InstallExamplesTargetFormat(TypedDict):
    # Which source files to install (dest-dir is fixed)
    sources: List[str]
    # Which package(s) that should have these files installed.
    into: NotRequired[List[str]]
It was not super difficult given the existing infrastructure, but it did take some hours of coding and debugging. Additionally, I added a parse hint to support making the into conditional based on whether it was a single binary package. Supporting the BinaryPackage type required:

- A conversion table where the parser generator can tell that BinaryPackage requires an input of str and a callback to map from str to BinaryPackage. (That is probably a lie. I think the conversion table came later, but honestly I do not remember and I am not digging into the git history for this one.)
- At runtime, said callback needed access to the list of known packages, so it could resolve the provided string.

With this done, you could now write something like:
# Map from
class InstallExamplesManifestFormat(TypedDict):
    # Note that sources here is split into source (str) vs. sources (List[str])
    sources: NotRequired[List[str]]
    source: NotRequired[
        Annotated[
            str,
            DebputyParseHint.target_attribute("sources")
        ]
    ]
    # We allow the user to write `into: foo` in addition to `into: [foo]`
    into: Union[BinaryPackage, List[BinaryPackage]]

# ... into
class InstallExamplesTargetFormat(TypedDict):
    # Which source files to install (dest-dir is fixed)
    sources: List[str]
    # Which package(s) that should have these files installed.
    into: NotRequired[
        Annotated[
            List[BinaryPackage],
            DebputyParseHint.required_when_multi_binary()
        ]
    ]
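Conceptually, the conversion table from the earlier bullet points boils down to a mapping from a target type to an accepted input type plus a resolver callback; a hypothetical sketch with invented names, not debputy's actual data structure:

from typing import Callable, Dict, Tuple, Type

class BinaryPackage:
    def __init__(self, name: str) -> None:
        self.name = name

def resolve_binary_package(name: str, known_packages: Dict[str, BinaryPackage]) -> BinaryPackage:
    # The runtime context supplies known_packages, so a bad package
    # name becomes a proper parse error rather than a stack trace.
    try:
        return known_packages[name]
    except KeyError:
        raise ValueError(f"Unknown binary package: {name!r}") from None

# target type -> (accepted input type, resolver callback)
CONVERSION_TABLE: Dict[Type, Tuple[Type, Callable[..., object]]] = {
    BinaryPackage: (str, resolve_binary_package),
}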
In this way, everybody wins:

- The parser engine handles most of the error reporting, meaning users get most of the errors in a standard format without the plugin provider having to spend any effort on it. There will be some effort in more complex cases, but the common cases are done for you.
- It is easy to provide flexibility to users while avoiding having to write code to normalize the user input into a simplified, programmer-oriented format.
- The parser handles mapping from basic types into higher forms for you. These days, we have high-level types like FileSystemMode (either an octal or a symbolic mode) and different kinds of file system matches depending on whether globs should be performed. These types include their own validation and parsing rules that debputy handles for you.
- Introspection and support for providing online reference documentation. Also, debputy checks that the provided attribute documentation covers all the attributes in the manifest form. If you add a new attribute, debputy will remind you if you forget to document it as well. :)

Yes, writing this parser generator code was more enjoyable than writing the ad-hoc manual parsers it replaced. :)
Review: A Study in Scarlet, by Arthur Conan Doyle

Series: Sherlock Holmes #1
Publisher: AmazonClassics
Copyright: 1887
Printing: February 2018
ISBN: 1-5039-5525-7
Format: Kindle
Pages: 159
My surprise reached a climax, however, when I found incidentally that he was ignorant of the Copernican Theory and of the composition of the Solar System. That any civilized human being in this nineteenth century should not be aware that the earth travelled round the sun appeared to be to me such an extraordinary fact that I could hardly realize it. "You appear to be astonished," he said, smiling at my expression of surprise. "Now that I do know it I shall do my best to forget it." "To forget it!" "You see," he explained, "I consider that a man's brain originally is like a little empty attic, and you have to stock it with such furniture as you chose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things so that he has a difficulty in laying his hands upon it. Now the skilful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order. It is a mistake to think that that little room has elastic walls and can distend to any extent. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones."

This is directly contrary to my expectation that the best way to make leaps of deduction is to know something about a huge range of topics so that one can draw unexpected connections, particularly given the puzzle-box construction and odd details so beloved in classic mysteries. I'm now curious if Doyle stuck with this conception, and if there were any later mysteries that involved astronomy.

Speaking of classic mysteries, A Study in Scarlet isn't quite one, although one can see the shape of the genre to come. Doyle does not "play fair" by the rules that have not yet been invented. Holmes at most points knows considerably more than the reader, including bits of evidence that are not described until Holmes describes them and research that Holmes does off-camera and only reveals when he wants to be dramatic. This is not the sort of story where the reader is encouraged to try to figure out the mystery before the detective. Rather, what Doyle seems to be aiming for, and what Watson attempts (unsuccessfully) as the reader surrogate, is slightly different: once Holmes makes one of his grand assertions, the reader is encouraged to guess what Holmes might have done to arrive at that conclusion. Doyle seems to want the reader to guess technique rather than outcome, while providing only vague clues in general descriptions of Holmes's behavior at a crime scene.

The structure of this story is quite odd. The first part is roughly what you would expect: first-person narration from Watson, supposedly taken from his journals but not at all in the style of a journal and explicitly written for an audience. Part one concludes with Holmes capturing and dramatically announcing the name of the killer, who the reader has never heard of before. Part two then opens with... a western?
In the central portion of the great North American Continent there lies an arid and repulsive desert, which for many a long year served as a barrier against the advance of civilization. From the Sierra Nevada to Nebraska, and from the Yellowstone River in the north to the Colorado upon the south, is a region of desolation and silence. Nor is Nature always in one mood throughout the grim district. It comprises snow-capped and lofty mountains, and dark and gloomy valleys. There are swift-flowing rivers which dash through jagged cañons; and there are enormous plains, which in winter are white with snow, and in summer are grey with the saline alkali dust. They all preserve, however, the common characteristics of barrenness, inhospitality, and misery.

First, I have issues with the geography. That region contains some of the most beautiful areas on earth, and while a lot of that region is arid, describing it primarily as a repulsive desert is a bit much. Doyle's boundaries and distances are also confusing: the Yellowstone is a northeast-flowing river with its source in Wyoming, so the area between it and the Colorado does not extend to the Sierra Nevadas (or even to Utah), and it's not entirely clear to me that he realizes Nevada exists. This is probably what it's like for people who live anywhere else in the world when US authors write about their country.

But second, there's no Holmes, no Watson, and not even the pretense of a transition from the detective novel that we were just reading. Doyle just launches into a random western with an omniscient narrator. It features a lean, grizzled man and an adorable child that he adopts and raises into a beautiful free spirit, who then falls in love with a wild gold-rush adventurer. This was written about 15 years before the first critically recognized western novel, so I can't blame Doyle for all the cliches here, but to a modern reader all of these characters are straight from central casting.

Well, except for the villains, who are the Mormons. By that, I don't mean that the villains are Mormon. I mean Brigham Young is the on-page villain, plotting against the hero to force his adopted daughter into a Mormon harem (to use the word that Doyle uses repeatedly) and ruling Salt Lake City with an iron hand, border guards with passwords (?!), and secret police. This part of the book was wild. I was laughing out-loud at the sheer malevolent absurdity of the thirty-day countdown to marriage, which I doubt was the intended effect. We do eventually learn that this is the backstory of the murder, but we don't return to Watson and Holmes for multiple chapters.

Which leads me to the other thing that surprised me: Doyle lays out this backstory, but then never has his characters comment directly on the morality of it, only the spectacle. Holmes cares only for the intellectual challenge (and for who gets credit), and Doyle sets things up so that the reader need not concern themselves with aftermath, punishment, or anything of that sort. I probably shouldn't have been surprised (this does fit with the Holmes stereotype), but I'm used to modern fiction where there is usually at least some effort to pass judgment on the events of the story. Doyle draws very clear villains, but is utterly silent on whether the murder is justified.

Given its status in the history of literature, I'm not sorry to have read this book, but I didn't particularly enjoy it.
It is very much of its time: everyone's moral character is linked directly to their physical appearance, and Doyle uses the occasional racial stereotype without a second thought. Prevailing writing styles have changed, so the prose feels long-winded and breathless. The rivalry between Holmes and the police detectives is tedious and annoying. I also find it hard to read novels from before the general absorption of techniques of emotional realism and interiority into all genres. The characters in A Study in Scarlet felt more like cartoon characters than fully-realized human beings.

I have no strong opinion about the objective merits of this book in the context of its time, other than to note that the sudden inserted western felt very weird. My understanding is that this is not considered one of the better Holmes stories, and Holmes gets some deeper characterization later on. Maybe I'll try another of Doyle's works someday, but for now my curiosity has been sated.

Followed by The Sign of the Four.

Rating: 4 out of 10
Series: Deep Witches #1
Publisher: A Girl and Her Fed Books
Copyright: March 2021
ISBN: blackwing-war
Format: Kindle
Pages: 284
There are also color_encoding and color_range properties for the input colorspace conversion, but those are not covered by this work.

In case you're not familiar with AMD shared code, what we need to do is basically draw a map and navigate there! We have some DRM color properties after blending, but nothing before blending yet. But much of the hardware programming was already implemented in the AMD DC layer, thanks to the shared code. Still, both the DRM interface and its connection to the shared code were missing. That's when the search begins!
AMD driver-specific color pipeline:

Looking at the color capabilities of the hardware, we arrive at this initial set of properties. The path wasn't exactly like that: we had many iterations and discoveries until we reached this pipeline.
The Plane Degamma is our first driver-specific property before blending. It's used to linearize the color space from encoded values to light-linear values. We can use a pre-defined transfer function or a user lookup table (in short, LUT) to linearize the color space.

Pre-defined transfer functions for plane degamma are hardcoded curves that go to a specific hardware block called DPP Degamma ROM. It supports the following transfer functions: sRGB EOTF, BT.709 inverse OETF, PQ EOTF, and pure power curves Gamma 2.2, Gamma 2.4 and Gamma 2.6.
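To make "linearize" concrete, this is what the sRGB EOTF (one of the hardcoded ROM curves above) does to a single encoded channel value; a small Python sketch of the standard formula, not driver code:

def srgb_eotf(encoded: float) -> float:
    # sRGB EOTF: map an encoded value in [0, 1] to linear light.
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

The other pre-defined curves follow the same idea with different constants, or with a pure power exponent such as 2.2.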
We also have a one-dimensional LUT. This 1D LUT has four thousand ninety-six (4096) entries, the usual 1D LUT size in DRM/KMS. It's an array of drm_color_lut entries that goes to the DPP Gamma Correction block.

We also now have a color transformation matrix (CTM) for color space conversion. It's a 3x4 matrix of fixed-point values that goes to the DPP Gamut Remap block. Both pre- and post-blending matrices previously went to the same color block; we worked on detaching them to clear both paths. Now each CTM goes its own way.
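A 3x4 matrix here means a 3x3 color matrix plus one additive offset per channel; conceptually (a plain-float Python sketch, ignoring the fixed-point representation the hardware actually uses):

def apply_ctm_3x4(ctm, rgb):
    # ctm: 3 rows of 4 coefficients; the 4th column is the per-channel offset.
    r, g, b = rgb
    return tuple(row[0] * r + row[1] * g + row[2] * b + row[3] for row in ctm)

# Identity CTM: passes colors through unchanged.
identity = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
]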
Next, the HDR Multiplier. The HDR Multiplier is a factor applied to the color values of an image to increase their overall brightness. This is useful for converting images from a standard dynamic range (SDR) to a high dynamic range (HDR). As it can range beyond [0.0, 1.0], subsequent transforms need to use the PQ (HDR) transfer functions.

And we need a 3D LUT. But a 3D LUT has a limited number of entries in each dimension, so we want to use it in a colorspace that is optimized for human vision, meaning a non-linear space. To deliver it, userspace may need one 1D LUT before the 3D LUT to delinearize content and another one after to linearize content again for blending.
The pre-3D-LUT curve is called the Shaper curve. Unlike the Degamma TF, there are no hardcoded curves for the shaper TF, but we can use the AMD color module in the driver to build shaper curves from pre-defined coefficients. The color module combines the TF and the user LUT values into the LUT that goes to the DPP Shaper RAM block.

Finally, our rockstar, the 3D LUT. The 3D LUT is perfect for complex color transformations and adjustments between color channels. It is also more complex to manage and requires more computational resources; as a consequence, its number of entries is usually limited. To overcome this restriction, the array contains samples from the approximated function, and values between samples are estimated by tetrahedral interpolation. AMD supports 17 and 9 as the size of a single dimension. Blue is the outermost dimension, red the innermost.
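For intuition, tetrahedral interpolation splits each LUT cell (the cube between eight neighbouring samples) into six tetrahedra and blends the four vertices of whichever one contains the input. A self-contained Python/NumPy sketch of the standard technique follows; it illustrates the math, not the hardware's implementation:

import numpy as np

def sample_3dlut_tetrahedral(lut: np.ndarray, rgb: np.ndarray) -> np.ndarray:
    # lut: shape (N, N, N, 3); rgb: three values in [0, 1].
    n = lut.shape[0]
    scaled = rgb * (n - 1)
    base = np.minimum(scaled.astype(int), n - 2)  # lower corner of the cell
    fr, fg, fb = scaled - base                    # fractional position in the cell

    def v(dr, dg, db):  # fetch one of the cell's eight corner samples
        return lut[base[0] + dr, base[1] + dg, base[2] + db]

    # Pick one of the six tetrahedra from the ordering of (fr, fg, fb)
    # and blend its four vertices; the weights always sum to 1.
    if fr >= fg >= fb:
        return (1 - fr) * v(0, 0, 0) + (fr - fg) * v(1, 0, 0) + (fg - fb) * v(1, 1, 0) + fb * v(1, 1, 1)
    if fr >= fb >= fg:
        return (1 - fr) * v(0, 0, 0) + (fr - fb) * v(1, 0, 0) + (fb - fg) * v(1, 0, 1) + fg * v(1, 1, 1)
    if fb >= fr >= fg:
        return (1 - fb) * v(0, 0, 0) + (fb - fr) * v(0, 0, 1) + (fr - fg) * v(1, 0, 1) + fg * v(1, 1, 1)
    if fg >= fr >= fb:
        return (1 - fg) * v(0, 0, 0) + (fg - fr) * v(0, 1, 0) + (fr - fb) * v(1, 1, 0) + fb * v(1, 1, 1)
    if fg >= fb >= fr:
        return (1 - fg) * v(0, 0, 0) + (fg - fb) * v(0, 1, 0) + (fb - fr) * v(0, 1, 1) + fr * v(1, 1, 1)
    return (1 - fb) * v(0, 0, 0) + (fb - fg) * v(0, 0, 1) + (fg - fr) * v(0, 1, 1) + fr * v(1, 1, 1)

With 17 entries per dimension, that is only 17**3 = 4913 stored samples to cover the whole cube, which is why the interpolation step matters.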
As mentioned, we need a post-3D-LUT curve to linearize the color space before blending. This is done by the Blend TF and LUT. Similar to the shaper TF, there are no hardcoded curves for the Blend TF. The pre-defined curves are the same as in the Degamma block, but calculated by the color module. The resulting LUT goes to the DPP Blend RAM block.

Now we have everything connected before blending. As a conflict between plane and CRTC Degamma was inevitable, our approach doesn't accept both being set at the same time.
We also optimized the conversion of the framebuffer to wire encoding by adding support for pre-defined CRTC Gamma TFs. Again, there are no hardcoded curves, and the TF and LUT are combined by the AMD color module. The same types of shaper curves are supported. The resulting LUT goes to the MPC Gamma RAM block.

Finally, we arrived at the final version of the DRM/AMD driver-specific color management pipeline. With this knowledge, you're ready to better enjoy the rainbow treasure of AMD display hardware and the world of graphics computing.
With this work, Gamescope/Steam Deck embraces the color capabilities of the AMD GPU. We highlight here how we map the Gamescope color pipeline to each AMD color block.

Future work:

The search for the rainbow treasure is not over! The Linux DRM subsystem contains many hidden treasures from different vendors. We want more complex color transformations and adjustments available on Linux. We also want to expose all GPU color capabilities from all hardware vendors to the Linux userspace.

Thanks to Joshua and Harry for this joint work, and to the Linux DRI community for all the feedback and reviews.

The amazing part of this work comes in the next talk with Joshua and The Rainbow Frogs!

Any questions?
The package is orphaned and its upstream is no longer developed. It depends on gtk2, has a low popcon and no reverse dependencies.

So I had little hope to see this program back in Debian. The git repository shows little activity, the last being two years ago. Interestingly, I do not quite remember what it was testing, but I do remember using it to find dead pixels, confirm native resolution, and do various pixel-peeping. Here's a screenshot of one of the screentest screens:

Now, I think it's safe to assume this program is dead and buried, and anyways I'm running wayland now, surely there's something better? Well, no. Of course not. Someone would know about it and tell me before I go on a random coding spree in a fit of procrastination... riiight? At least, the Debconf video team didn't seem to know of any replacement. They actually suggested I just "invoke gstreamer directly" and "embrace the joy of shell scripting".
Patterns can be selected with the --patterns flag. A list of patterns relevant to your installation is available with the gst-inspect-1.0 videotestsrc command. By default, screentest goes through all patterns; each pattern runs indefinitely until you close the window, then the next pattern starts.
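For a sense of what screentest wraps, a raw pipeline for a single pattern looks roughly like this (an illustrative gst-launch invocation, not necessarily the exact command the script builds):

gst-launch-1.0 videotestsrc pattern=smpte ! videoconvert ! autovideosink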
You can restrict to a subset of patterns; for example, this would be a good test for dead pixels:

screentest --patterns black,white,red,green,blue

This would be a good sharpness test:

screentest --patterns pinwheel,spokes,checkers-1,checkers-2,checkers-4,checkers-8

A good generic test is the classic SMPTE color bars; it is the first in the list, but you can run only that test with:

screentest --patterns smpte
(I will mention, by the way, that as a system administrator with decades of experience, it is nearly impossible to type SMPTE without first typing SMTP and re-typing it again a few times before I get it right. I fully expect this post to have numerous typos.)

Here's an example of the SMPTE pattern from Wikipedia:

For multi-monitor setups, screentest also supports specifying which output to use at its native resolution, with --output. Failing that, it will try to look at the outputs and use the first one it finds. If it fails to find anything, you can specify a resolution with --resolution WIDTHxHEIGHT.
I have tried to make it go full screen by default, but stumbled upon a bug in Sway that crashes gst-launch. If your Wayland compositor supports it, you can possibly enable full screen with --sink waylandsink fullscreen=true. Otherwise it will create a new window that you will have to make fullscreen yourself.
For completeness, there's also an --audio flag that will emit the classic "drone", a sine wave at 440Hz at 40% volume (the audiotestsrc gstreamer plugin). And there's an --overlay-name option to show the pattern name, in case you get lost and want to start with one of them again.
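A roughly equivalent raw pipeline for the drone (again illustrative; freq and volume are standard audiotestsrc properties):

gst-launch-1.0 audiotestsrc wave=sine freq=440 volume=0.4 ! autoaudiosink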
There might be some additional patterns that could be useful, but I think those are better left to gstreamer. I, for example, am somewhat nostalgic for the Philips circle pattern that used to play for TV stations that were off-air in my area. But that, in my opinion, would be better added to the gstreamer plugin than into a separate thing.
The script shows which command is being run, so it's a good introduction to gstreamer pipelines. Advanced users (and the video team) will possibly not need screentest and will design their own pipelines with their own tools.
I've previously worked with ffmpeg pipelines (in another such procrastinated coding spree, video-proxy-magic), and I found gstreamer more intuitive, even though it might be slightly less powerful.
In retrospect, I should probably have picked a new name, to avoid colliding with the namespace already used by the project, which is now on GitHub. Who knows, it might come back to life after this blog post; it would not be the first time.
For now, the project lives alongside the rest of my scripts collection, but if there's sufficient interest, I might move it to its own git repository. Comments, feedback, and contributions are as usual welcome. And naturally, if you know something better for this kind of stuff, I'm happy to learn more about your favorite tool!
So now I have finally found something to test my projector, which will
likely confirm what I've already known all along: that it's kind
of a piece of crap and I need to get a proper one.
In March 2023, Ken gave the closing keynote [and] during the Q&A session, someone jokingly asked about the Turing award lecture, specifically: "can you tell us right now whether you have a backdoor into every copy of gcc and Linux still today?"

Although Ken reveals (or at least claims!) that he has no such backdoor, he does admit that he has the actual code, which Russ requests and subsequently dissects in great but accessible detail.
Arch Linux packages become reproducible a median of 30 days quicker when compared to Debian packages, while Debian packages remain reproducible for a median of 68 days longer once fixed.

A full PDF of their paper is available online, as are many other interesting papers on the MCIS publication page.
... the nixos-minimal image that is used to install NixOS. In their post, Arnout details what exactly can be reproduced, and even includes some of the history of this endeavour:

You may remember a 2021 announcement that the minimal ISO was 100% reproducible. While back then we successfully tested that all packages that were needed to build the ISO were individually reproducible, actually rebuilding the ISO still introduced differences. This was due to some remaining problems in the hydra cache and the way the ISO was created. By the time we fixed those, regressions had popped up (notably an upstream problem in Python 3.10), and it isn't until this week that we were back to having everything reproducible and being able to validate the complete chain.

Congratulations to the NixOS team for reaching this important milestone! Discussion about this announcement can be found underneath the post itself, as well as on Hacker News.
arm64 hardware from Codethink

Long-time sponsor of the project, Codethink, have generously replaced our old Moonshot-Slides hardware, which they have hosted since 2016, with new KVM-based arm64 hardware. Holger Levsen integrated these new nodes into the Reproducible Builds continuous integration framework.
Elsewhere in the project, changes landed relating to ext4 filesystem images and to honouring the SOURCE_DATE_EPOCH environment variable in order to close Debian bug #1034422. In addition, 8 reviews of packages were added, 74 were updated and 56 were removed this month, all adding to our knowledge about identified issues.
Bernhard M. Wiedemann published another monthly report about reproducibility within openSUSE.
- edje_cc (race condition)
- elasticsearch (build failure)
- erlang-retest (embedded .zip timestamp)
- fdo-client (embeds private keys)
- fftw3 (random ordering)
- gsoap (date issue)
- gutenprint (date)
- hub/golang (embeds random build path)
- Hyprland (filesystem issue)
- kitty (sort-related issue, .tar file embeds modification time)
- libpinyin (ASLR)
- maildir-utils (date embedded in copyright)
- mame (order-related issue)
- mingw32-binutils & mingw64-binutils (date)
- MooseX (date from perl-MooseX-App)
- occt (sorting issue)
- openblas (embeds CPU count)
- OpenRGB (corruption-related issue)
- python-numpy (random file names)
- python-pandas (FTBFS)
- python-quantities (date)
- python3-pyside2 (order)
- qemu (date and Sphinx issue)
- qpid (sorting problem)
- rakudo (filesystem ordering issue)
- SLOF (date-related issue)
- spack (CPU counting issue)
- xemacs-packages (date-related issue)

Finally, if file -i returns text/plain, there is now a fallback to comparing the input as a text file; this was originally filed as Debian bug #1053668 by Niels Thykier. This was then uploaded to Debian (and elsewhere) as version 251.
Other work this month touched the #debian-reproducible-changes IRC channel, deploying systemd-oomd on all Debian bookworm nodes (re. Debian bug #1052257), schroots, and the new arm64 machines from Codethink.

You can get in touch with the Reproducible Builds project via the #reproducible-builds IRC channel on irc.oftc.net, or via the rb-general@lists.reproducible-builds.org mailing list.
... systemd unit files if dh_installsystemd and systemd.pc change their installation targets. Unfortunately, doing so makes some packages FTBFS, and therefore patches have been filed.
The analysis tool, dumat, has been enhanced to better understand which upgrade scenarios are considered supported, to reduce false-positive bug filings, and it gained a mode for local operation on a .changes file meant for inclusion in salsa-ci. The filing of bugs from dumat is still manual, to improve the quality of reports.

Since September, the moratorium has been lifted.
- ... debvm, enabling its use in autopkgtests.
- ... autopkgtest-build-qemu. It is powered by mmdebstrap, therefore unprivileged, EFI-only, and will soon be included in mmdebstrap.

Review: The Cassini Division, by Ken MacLeod

Series: Fall Revolution #3
Publisher: Tor
Copyright: 1998
Printing: August 2000
ISBN: 0-8125-6858-3
Format: Mass market
Pages: 305
Life is a process of breaking down and using other matter, and if need be, other life. Therefore, life is aggression, and successful life is successful aggression. Life is the scum of matter, and people are the scum of life. There is nothing but matter, forces, space and time, which together make power. Nothing matters, except what matters to you. Might makes right, and power makes freedom. You are free to do whatever is in your power, and if you want to survive and thrive you had better do whatever is in your interests. If your interests conflict with those of others, let the others pit their power against yours, everyone for theirselves. If your interests coincide with those of others, let them work together with you, and against the rest. We are what we eat, and we eat everything. All that you really value, and the goodness and truth and beauty of life, have their roots in this apparently barren soil. This is the true knowledge. We had founded our idealism on the most nihilistic implications of science, our socialism on crass self-interest, our peace on our capacity for mutual destruction, and our liberty on determinism. We had replaced morality with convention, bravery with safety, frugality with plenty, philosophy with science, stoicism with anaesthetics and piety with immortality. The universal acid of the true knowledge had burned away a world of words, and exposed a universe of things. Things we could use.

This is certainly something that some people will believe, particularly cynical college students who love political theory, feeling smarter than other people, and calling their pet theories things like "the true knowledge." It is not even remotely believable as the governing philosophy of a solar confederation. The point of government for the average person in human society is to create and enforce predictable mutual rules that one can use as a basis for planning and habits, allowing you to not think about politics all the time. People who adore thinking about politics have great difficulty understanding how important it is to everyone else to have ignorable government. Constantly testing your power against other coalitions is a sport, not a governing philosophy. Given the implication that this testing is through violence or the threat of violence, it beggars belief that any large number of people would tolerate that type of instability for an extended period of time.

Ellen is fully committed to the true knowledge. MacLeod likely is not; I don't think this represents the philosophy of the author. But the primary political conflict in this novel famous for being political science fiction is between the above variation of anarchy and an anarchocapitalist society, neither of which are believable as stable political systems for large numbers of people. This is a bit like seeking out a series because you were told it was about a great clash of European monarchies and discovering it was about a fight between Liberland and Sealand. It becomes hard to take the rest of the book seriously.

I do realize that one point of political science fiction is to play with strange political ideas, similar to how science fiction plays with often-implausible science ideas. But those ideas need some contact with human nature. If you're going to tell me that the key to clawing society back from a world-wide catastrophic descent into chaos is to discard literally every social system used to create predictability and order, you had better be describing aliens, because that's not how humans work.
The rest of the book is better. I am untangling a lot of backstory for the above synopsis, which in the book comes in dribs and drabs, but piecing that together is good fun. The plot is far more straightforward than the previous two books in the series: there is a clear enemy, a clear goal, and Ellen goes from point A to point B in a comprehensible way with enough twists to keep it interesting. The core moral conflict of the book is that Ellen is an anti-AI fanatic to the point that she considers anyone other than non-uploaded humans to be an existential threat. MacLeod gives the reader both reasons to believe Ellen is right and reasons to believe she's wrong, which maintains an interesting moral tension.

One thing that MacLeod is very good at is what Bob Shaw called "wee thinky bits." I think my favorite in this book is the computer technology used by the Cassini Division, who have spent a century in close combat with inimical AI capable of infecting any digital computer system with tailored viruses. As a result, their computers are mechanical non-Von-Neumann machines, but mechanical with all the technology of a highly-advanced 24th century civilization with nanometer-scale manufacturing technology. It's a great mental image and a lot of fun to think about.

This is the only science fiction novel that I can think of that has a hard-takeoff singularity that nonetheless is successfully resisted and fought to a stand-still by unmodified humanity. Most writers who were interested in the singularity idea treated it as either a near-total transformation leaving only remnants or as something that had to be stopped before it started. MacLeod realizes that there's no reason to believe a post-singularity form of life would be either uniform in intent or free from its own baffling sudden collapses and reversals, which can be exploited by humans. It makes for a much better story.

The sociology of this book is difficult to swallow, but the characterization is significantly better than the previous books of the series and the plot is much tighter. I was too annoyed by the political science to fully enjoy it, but that may be partly the fault of my expectations coming in. If you like chewy, idea-filled science fiction with a lot of unexplained world-building that you have to puzzle out as you go, you may enjoy this, although unfortunately I think you need to read at least The Stone Canal first. The ending was a bit unsatisfying, but even that includes some neat science fiction ideas.

Followed by The Sky Road, although I understand it is not a straightforward sequel.

Rating: 6 out of 10
                  | IOPS  | Bandwidth
4k random writes  | 19.3k | 75.6 MiB/s
4k random reads   | 36.1k | 141 MiB/s
Sequential writes |       | 2300 MiB/s
Sequential reads  |       | 3800 MiB/s

                  | IOPS  | Bandwidth
4k random writes  | 16k   | ?
4k random reads   | 90k   | ?
Sequential writes |       | 280 MiB/s
Sequential reads  |       | 560 MiB/s

                  | IOPS  | Bandwidth
4k random writes  | 430   | 1.7 MiB/s
4k random reads   | 8006  | 32 MiB/s
Sequential writes |       | 311 MiB/s
Sequential reads  |       | 566 MiB/s
"Debian 30 years of collective intelligence" -Maqsuel Maqson Brazil
The cake is there. :) Honorary Debian Developers: Buzz, Jessie, and Woody welcome guests to this amazing party. Sao Carlos, state of Sao Paulo, Brazil Stickers, and Fliers, and Laptops, oh my! Belo Horizonte, Brazil Bras lia, Brazil Bras lia, Brazil Mexico 30 a os! A quick Selfie We do not encourage beverages on computing hardware, but this one is okay by us. Germany
The German Delegation is also looking for this dog who footed the bill for the party, then left mysteriously.
We brought the party back inside at CCCamp Belgium
Cake and Diversity in Belgium El Salvador
Food and Fellowship in El Salvador South Africa
Debian is also very delicious!
All smiles waiting to eat the cake Reports Debian Day 30 years in Macei - Brazil Debian Day 30 years in S o Carlos - Brazil Debian Day 30 years in Pouso Alegre - Brazil Debian Day 30 years in Belo Horizonte - Brazil Debian Day 30 years in Curitiba - Brazil Debian Day 30 years in Bras lia - Brazil Debian Day 30 years online in Brazil Articles & Blogs Happy Debian Day - going 30 years strong - Liam Dawe Debian Turns 30 Years Old, Happy Birthday! - Marius Nestor 30 Years of Stability, Security, and Freedom: Celebrating Debian s Birthday - Bobby Borisov Happy 30th Birthday, Debian! - Claudio Kuenzier Debian is 30 and Sgt Pepper Is at Least Ninetysomething - Christine Hall Debian turns 30! -Corbet Thirty years of Debian! - Lennart Hengstmengel Debian marks three decades as 'Universal Operating System' - Sam Varghese Debian Linux Celebrates 30 Years Milestone - Joshua James 30 years on, Debian is at the heart of the world's most successful Linux distros - Liam Proven Looking Back on 30 Years of Debian - Maya Posch Cheers to 30 Years of Debian: A Journey of Open Source Excellence - arindam Discussions and Social Media Debian Celebrates 30 Years - Source: News YCombinator Brand-new Linux release, which I'm calling the Debian ... Source: News YCombinator Comment: Congrats @debian !!! Happy Birthday! Thank you for becoming a cornerstone of the #opensource world. Here's to decades of collaboration, stability & #software #freedom -openSUSELinux via X (formerly Twitter) Comment: Today we #celebrate the 30th birthday of #Debian, one of the largest and most important cornerstones of the #opensourcecommunity. For this we would like to thank you very much and wish you the best for the next 30 years! Source: X (Formerly Twitter -TUXEDOComputers via X (formerly Twitter) Happy Debian Day! - Source: Reddit.com Video The History of Debian The Beginning - Source: Linux User Space Debian Celebrates 30 years -Source: Lobste.rs Video Debian At 30 and No More Distro Hopping! - LWDW388 - Source: LinuxGameCast Debian Celebrates 30 years! - Source: Debian User Forums Debian Celebrates 30 years! - Source: Linux.org
Series: Legends & Lattes #1
Publisher: Tor
Copyright: 2022
ISBN: 1-250-88609-0
Format: Kindle
Pages: 293