Saturday Morning Breakfast Cereal - Enginomics



Click here to go see the bonus panel!

Hovertext:
The second step is to get the cartoonists to leave town.


3 public comments

ReadLots:
Wow, nailed it.

jlvanderzwan:
Sounds great, just make sure that moving there is a one-way ticket. In a few years Blockchain City will have collapsed into an "Escape from New York"-like hellscape while the rest of the world is finally freed of those damn ugly apes.

Lythimus:
Re: Making cartoonists leave town. Just make it so the coffee shops ONLY accept crypto.

Finally got rid of a/ and b/ in git diff outputs!


You know how there are these little annoyances that are just mild enough that you never do anything about them?

In the world of open source there’s always this notion of “if you want something to be different, the code is there, you can change it”, but most often this is not practical: I would never go about carrying a patched version of Git with me to every machine I work on just because of the annoying `a/` and `b/` prefixes that show up on Git diffs.

But those tiny prefixes have always kept me from selecting and pasting a filename with a double-click and a middle-click in the terminal.

Today, after who knows how many years, I decided to search for it — "I can't be the only one annoyed by this, right?" — and lo and behold: someone did ask about it on StackOverflow, and there is a global configuration to disable those prefixes:

git config --global diff.noprefix true

And just like that, this annoyance is gone!
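
For illustration (using a hypothetical file named README.md), a diff header that used to look like this:

diff --git a/README.md b/README.md
--- a/README.md
+++ b/README.md

now comes out like this:

diff --git README.md README.md
--- README.md
+++ README.md

so a double-click in the terminal selects exactly the filename. The same behavior is also available per invocation with git diff --no-prefix.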


A belated writeup of CVE-2022-28201 in MediaWiki


In December 2021, I discovered CVE-2022-28201: it's possible to send MediaWiki's Title::newMainPage() into infinite recursion. More specifically, if the local interwikis feature is configured (not used by default, but enabled on Wikimedia wikis), any on-wiki administrator could fully brick the wiki by maliciously editing the [[MediaWiki:Mainpage]] wiki page. Recovering would require someone with sysadmin access, either to adjust the site configuration or to manually edit the database.

In this post I'll explain the vulnerability in more detail, how Rust helped me discover it, and a better way to fix it long-term.

The vulnerability

At the heart of this vulnerability is Title::newMainPage(). The function, before my patch, is as follows (link):

public static function newMainPage( MessageLocalizer $localizer = null ) {
    if ( $localizer ) {
        $msg = $localizer->msg( 'mainpage' );
    } else {
        $msg = wfMessage( 'mainpage' );
    }
    $title = self::newFromText( $msg->inContentLanguage()->text() );
    // Every page renders at least one link to the Main Page (e.g. sidebar).
    // If the localised value is invalid, don't produce fatal errors that
    // would make the wiki inaccessible (and hard to fix the invalid message).
    // Gracefully fallback...
    if ( !$title ) {
        $title = self::newFromText( 'Main Page' );
    }
    return $title;
}

It gets the contents of the "mainpage" message (editable on-wiki at MediaWiki:Mainpage), parses the contents as a page title and returns it. As the comment indicates, it is called on every page view and as a result has a built-in fallback if the configured main page value is invalid for whatever reason.

Now, let's look at how interwiki links work. Normal interwiki links are pretty simple, they take the form of [[prefix:Title]], where the prefix is the interwiki name of a foreign site. In the default interwiki map, "wikipedia" points to https://en.wikipedia.org/wiki/$1. There's no requirement that the interwiki target even be a wiki, for example [[google:search term]] is a supported prefix and link.

And if you type in [[wikipedia:]], you'll get a link to https://en.wikipedia.org/wiki/, which redirects to the Main Page. Nice!

Local interwiki links are a bonus feature on top of this to make sharing of content across multiple wikis easier. A local interwiki is one that maps to the wiki we're currently on. For example, you could type [[wikipedia:Foo]] on the English Wikipedia and it would be the same as just typing in [[Foo]].
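
For reference, the feature is driven by site configuration. A minimal sketch of what that might look like in LocalSettings.php, assuming the wiki's interwiki table already has a "wikipedia" prefix pointing back at this same wiki:

// Treat the "wikipedia" interwiki prefix as referring to this wiki itself
$wgLocalInterwikis = [ 'wikipedia' ];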

So now what if you're on English Wikipedia and type in [[wikipedia:]]? Naively that would be the same as typing [[]], which is not a valid link.

So in c815f959d6b27 (first included in MediaWiki 1.24), the title parsing code was changed so that a link like [[wikipedia:]] (where the prefix is a local interwiki) resolves explicitly to the main page. This seems like entirely logical behavior and achieves the goal of local interwiki links - making the same link work the same way regardless of which wiki it's on.

Except it now means that when trying to parse a title, the answer might end up being "whatever the main page is". And if we're trying to parse the "mainpage" message to discover where the main page is? Boom, infinite recursion.

All you have to do is edit "MediaWiki:Mainpage" on your wiki to be something like localinterwiki: and your wiki is mostly hosed, requiring someone to either de-configure that local interwiki or manually edit that message via the database to recover it.

The patch I implemented was pretty simple, just add a recursion guard with a hardcoded fallback:

    public static function newMainPage( MessageLocalizer $localizer = null ) {
+       static $recursionGuard = false;
+       if ( $recursionGuard ) {
+           // Somehow parsing the message contents has fallen back to the
+           // main page (bare local interwiki), so use the hardcoded
+           // fallback (T297571).
+           return self::newFromText( 'Main Page' );
+       }
        if ( $localizer ) {
            $msg = $localizer->msg( 'mainpage' );
        } else {
            $msg = wfMessage( 'mainpage' );
        }

+       $recursionGuard = true;
        $title = self::newFromText( $msg->inContentLanguage()->text() );
+       $recursionGuard = false;

        // Every page renders at least one link to the Main Page (e.g. sidebar).
        // If the localised value is invalid, don't produce fatal errors that

Discovery

I was mostly exaggerating when I said Rust helped me discover this bug. I previously blogged about writing a MediaWiki title parser in Rust, and it was while working on that I read the title parsing code in MediaWiki enough times to discover this flaw.

A better fix

I do think that long-term, we have better options to fix this.

There's a new, somewhat experimental, configuration option called $wgMainPageIsDomainRoot. The idea is that rather than serve the main page from /wiki/Main_Page, it would just be served from /. Conveniently, this would mean that it doesn't actually matter what the name of the main page is, since we'd just have to link to the domain root.
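
If it pans out, enabling it should be a one-line change in LocalSettings.php (a sketch only, since the option is still experimental):

// Serve the main page from the domain root instead of /wiki/Main_Page
$wgMainPageIsDomainRoot = true;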

There is an open request for comment to enable such functionality on Wikimedia sites. It would be a small performance win, give everyone cleaner URLs, and possibly break everything that expects https://en.wikipedia.org/ to return an HTTP 301 redirect, as it has for the past 20+ years. Should be fun!

Timeline

Acknowledgements

Thank you to Scott Bassett of the Wikimedia Security team for reviewing and deploying my patch, and Reedy for backporting and performing the security release.


A proposal for a new amateur radio net


This is a proposal for a new type of amateur radio “net.” A “net,” in ham radio jargon, is basically a meeting on the radio, usually on a repeater when using VHF/UHF or via direct simplex when using HF voice or CW. There is a generally accepted protocol for these meetings, which I will briefly summarize, and I am proposing that we, as an amateur community, try a new format.

The existing format

Let’s consider the problem of getting 10 people to talk to each other on the radio, and then we’ll understand why the net does things a certain way. The first problem the net solves is a classic communication protocol question: when and where do I send my traffic? Nets generally happen at fixed dates and times, and the frequencies/repeaters are “well known,” just as HTTP etc. occur on “well known” ports. You can send HTTP traffic on port 123, for example, but you might not make your NTP servers happy.

Ok, so now we have 10 people at the right time and frequency; who talks first? If all 10 people keyed up at the same time, everyone’s signals would be mangled, because a single frequency is a shared communication medium. Early computer networks had this problem as well, and I’m sure you’ve noticed that nowadays there is an Ethernet cable for every device, each running into a switch. This effectively eliminates the possibility of collision.

On the air, we could do something where every participant is assigned their own frequency. As hams, we certainly have plenty of spectrum to go around. A single net control operator could then “packetize” (summarize) the traffic and announce it on a common frequency. Something like this is already done on HF: if a station can’t reach net control, its traffic can be relayed by another station. Local repeaters don’t do this because there is generally just one repeater, so it’s inherently shared.

Which leads me to the biggest issue with the current protocol: most of the net is spent just checking hams into the net. There are various ways to make this run smoother, but they all come at the expense of time. For example, check-ins are sometimes split up alphabetically (A-K and so on), but we must still take everyone’s callsign _serially_. So the Net Control Station (NCS) is a very busy person! He must hear all the callsigns; if two people “double,” he must ask each of them to repeat; and so on. Meanwhile, every other ham is just sitting there, tied to their radio, either waiting to check in or just listening to two people struggling to check in.

Proposal

My proposal is a simple improvement on this: check into the net _via the internet_. If you don’t like the internet in your ham radio, I hear you, but the community seems very interested in these hotspots where the RF traffic travels about one meter to a wifi-connected device, so I think there is some interest in the internet among hams.

The protocol / script can be changed to something like the following:

Welcome to the Internet is good for you Net. This is a directed Net. If you are able to, please go to www.<website>.xyz and check into this net. It saves time and is more accurate.

I will now read the callsigns of those who checked in online: etc…

Hams online could listen to the audio stream perhaps via HLS or other streaming technology and interact with the net via chat. Net control can relay this via RF and acknowledge comments.

Basically, net control could act more like a YouTube streamer – handling RF/chat messages while talking over RF AND streaming to some site.
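
The check-in page itself could be trivial. As a purely hypothetical sketch in PHP (the file name and form field here are made up), something like this would be enough to collect callsigns for net control to read back:

<?php
// checkin.php: hypothetical online net check-in endpoint (names are made up).
if ( $_SERVER['REQUEST_METHOD'] === 'POST' ) {
    $callsign = strtoupper( trim( $_POST['callsign'] ?? '' ) );
    // Loose sanity check; real callsign validation would be stricter.
    if ( preg_match( '/^[A-Z0-9\/]{3,10}$/', $callsign ) ) {
        // Append to a simple list that net control reads over the air.
        file_put_contents( 'checkins.txt', $callsign . "\n", FILE_APPEND | LOCK_EX );
        echo 'Checked in: ' . htmlspecialchars( $callsign );
    } else {
        http_response_code( 400 );
        echo 'That does not look like a callsign.';
    }
}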




Monday, June 27, 2022 - aphoristic noodling


I read this post by Baldur Bjarnason, listing "Everything I’ve learned about web development in the almost twenty-five years I’ve been practising", and this followup, which says:

Some of the aphorisms ended up not-so-pithy, but it was overall a fun little experiment that I recommend: note down everything relevant about the craft that you can think of over the space of a week.

I thought about this, and then I thought: Ok, what exactly is my craft? I do computer shit. So I started a list about that, challenging myself to be descriptive about things and not veer too far into pure advice.

A year or so passed, and I noticed this post was still sitting in my "work in progress" directory. I tried picking it back up and noticed how much overlap it would have with other posts like these:

This style of writing is basically catnip to people like me, whether it's of much use to anyone else or not. This post ultimately felt like a dead end, because instead of a blog post, it really wants to be some long document where I collect all sorts of aphorisms, pithy quotes, eponymous laws, and so forth about technical work and maybe just work generally. Maybe I'll start that document one of these days.

Anyway, that very partial and uneven list:

  1. Caching is hard to think about and breaks often.
  2. Cleverness in code is generally a sign of danger.
  3. Business ruins everything.
  4. Some forms of interoperability are a trap.
  5. Bad ideas aren't limited to bad people.
  6. Good people aren't limited to good ideas.
  7. An aesthetic is not an ethic.
  8. The customer is usually wrong.
  9. If it's written in:
    • C: It'll work, but I should remember there's a buffer overflow or something.
    • PHP: It'll probably work, but there's an SQL injection vulnerability somewhere and the cool kids will be shitty about it being PHP.
    • Python: 50/50 whether it'll just barf stack traces into my terminal for non-obvious reasons.
    • Ruby: Decent chance I'll wind up reading the source code and cursing at clever Ruby programmers.
    • Haskell: It works, but I'm not smart enough to understand it.
    • Rust: Probably works, if they finished writing it. I'm not smart enough to understand the code.
    • Go: Total crapshoot, but either way I bet the CLI has a bunch of infuriatingly nested subcommands.
    • JavaScript: Life is too short to deal with whatever package management and runtime I'm supposed to use for this now.
    • Java: If I have to find out it's Java, I'm probably in trouble.
  10. Lightweight markup languages are fundamentally in tension with the range of structures that their users will inevitably want to express.
  11. Design, marketing, and management are all real undertakings, but they are also aggressively self-reproducing ideological systems and political projects.
  12. Environments within which small tools can be combined to operate on simple abstractions are powerful. An environment might be what you think of as an operating system, a programming language, a database, or an application. All else being equal, the ones that can bridge to other environments are more powerful.
  13. There are few abstractions in computing more stable than filesystems, standard IO, text files, and the shell. Boring relational databases aren't too far behind, but the barriers to entry and data transfer are higher.
  14. Technology is at least as fashion-oriented as the sartorial choices of highschoolers, actors, and musicians. Changes are driven as much by a desire for difference from the perceived status quo as anything else.
  15. Technical politics are also organizational, labor, and identity politics. The currents of power they involve are illegible without taking those factors into account.
  16. There's no guarantee that your technical preferences will match up with the ideas, people, or power structures you find agreeable in other domains. (Or vice versa.)

tags: topics/technical, topics/work

p1k3 / 2022 / 6 / 27


Sunday, May 29, 2022


One earlier this month from Tyler on notebooks and paper notes.

This was a reminder that I’d been meaning to update notes on notes with the current shape of my system. My habits haven’t changed drastically in three years, but I’ve made some extensions worth describing. (In particular, I now make heavy use of the tagged log format I wrote about last year. In turn, that’s shown me some things that could be better.)

On a meta level, that document is still mostly boring technical specifics. I’d like it to include more of the why of things, the stuff I’ve come to realize after years of overthinking.

p1k3 / 2022 / 5 / 29
