Search Results: "dalla"

8 July 2024

Russ Allbery: Review: Beyond Control

Review: Beyond Control, by Kit Rocha
Series: Beyond #2
Publisher: Kit Rocha
Copyright: December 2013
ASIN: B00GIA4GN8
Format: Kindle
Pages: 364
Beyond Control is science fiction erotica (dystopian erotic romance, per the marketing) and a direct sequel to Beyond Shame. These books shift protagonists with each volume and enough of the world background is explained that you could start here, but there are significant spoilers for the previous book. I read this book as part of the Beyond Series Bundle (Books 1-3), which is what the sidebar information is for. This is one of those reviews that I write because I'm stubborn about reviewing all the books I read, not because it's likely to be useful to anyone. There are also considerably more spoilers for the shape of the story than I normally include, so be warned.

The Beyond series is erotica. Specifically, so far, consensual BDSM erotica with bisexuality but otherwise typical gender stereotypes. The authors (Kit Rocha is a pen name for Donna Herren and Bree Bridges) are women, so it's more female gaze than male gaze, but by erotica I don't mean romance with an above-average number of steamy scenes. I mean it felt like half the book by page count was descriptions of sex. This review is rather pointless because, one, I'm not going to review the sex that's the main point of the book, and two, I skimmed all the sex and read it for the story because I'm weird. Beyond Shame got me interested in these absurdly horny people and their post-apocalyptic survival struggles in the outskirts of a city run by a religious surveillance state, and I wanted to find out what happened next. Besides, this book promised to focus on my favorite character from the first novel, Lex, and I wanted to read more about her.

Beyond Control uses a series pattern that I understand is common in romance but which is not often seen in SFF (my usual genre): each book focuses on a new couple adjacent to the previous couple, while the happily ever after of the previous couple plays out in the background. In this case, it also teases the protagonists of the next book. I can see why romance uses this structure: it's an excuse to provide satisfying interludes for the reader. In between Lex and Dallas's current relationship problems, one gets to enjoy how well everything worked out for Noelle and how much she's grown.

In Beyond Shame, Lex was the sort-of partner of Dallas O'Kane, the leader of the street gang that is running Sector Four. (Picture a circle surrounding the rich-people-only city of Eden. That circle is divided into eight wedge-shaped sectors, which provide heavy industries, black-market pleasures, and slums for agricultural workers.) Dallas is an intensely possessive, personally charismatic semi-dictator who cultivates the image of a dangerous barbarian to everyone outside and most of the people inside Sector Four. Since he's supposed to be one of the good guys, this is more image than reality, but it's not entirely disconnected from reality.

This book is about Lex and Dallas forming an actual relationship, instead of the fraught and complicated thing they had in the first book. I was hoping that this would involve Dallas becoming less of an asshole. It unfortunately does not, although some of what I attributed to malice may be adequately explained by stupidity. I'm not sure that's an improvement.

Lex is great, just like she was in the first book. It's obvious by this point in the series that she does most of the emotional labor of keeping the gang running, and her support is central to Dallas's success. Like most of the people in this story, she has a nasty and abusive background that she's still dealing with in various ways. Dallas's possessiveness is intensely appealing to her, but she wants that possessiveness on different terms than Dallas may be willing to offer, or is even aware of. Lex was, I thought, exceptionally clear about what she wanted out of this relationship. Dallas thinks this is entirely about sex, and is, in general, dumber than a sack of hammers. That means fights.

Also orgies, but, well, hopefully you knew what you were getting into if you picked up this book. I know, I know, it's erotica, that's the whole point, but these people have a truly absurd amount of sex. Eden puts birth control in the water supply, which is a neat way to simplify some of the in-story consequences of erotica. They must be putting aphrodisiacs in the water supply as well.

There was a lot of sector politics in this book that I found way more interesting than it had any right to be. I really like most of these people, even Dallas when he manages to get his three brain cells connected for more than a few minutes. The events of the first book have a lot of significant fallout, Lex continues being a badass, the social dynamics between the women are very well-done (and pass the Bechdel test yet again even though this is mostly traditional-gender-role erotica), and if Dallas had managed to understand what he did wrong at a deeper-than-emotional level, I would have rather enjoyed the non-erotica story parts. Alas.

I therefore wouldn't recommend this book even if I were willing to offer any recommendations about erotica (which I'm not). I was hoping it was going somewhere more rewarding than it did. But I still kind of want to read another one? I am weirdly fascinated with the lives of these people. The next book is about Six, who has the potential to turn into the sort of snarky, cynical character I love reading about. And it's not that hard to skim over the orgies. Maybe Dallas will get one additional brain cell per book?

Followed by Beyond Pain.

Rating: 5 out of 10

30 August 2022

John Goerzen: The PC & Internet Revolution in Rural America

Inspired by several others (such as Alex Schroeder's post and Szczeżuja's prompt), as well as a desire to get this down for my kids, I figure it's time to write a bit about living through the PC and Internet revolution where I did: outside a tiny town in rural Kansas. And, as I've been back in that same area for the past 15 years, I reflect some on the challenges that continue to play out. Although the stories from the others were primarily about getting online, I want to start by setting some background. Those of you that didn't grow up in the same era as I did probably never realized that a typical business PC setup might cost $10,000 in today's dollars, for instance. So let me start with the background.

Nothing was easy

This story begins in the 1980s. Somewhere around my Kindergarten year of school, around 1985, my parents bought a TRS-80 Color Computer 2 (aka CoCo II). It had 64K of RAM and used a TV for display and sound. This got you the computer. It didn't get you any disk drive or anything, no joysticks (required by a number of games). So whenever the system powered down, or it hung and you had to power cycle it (a frequent event), you'd lose whatever you were doing and would have to re-enter the program, literally by typing it in. The floppy drive for the CoCo II cost more than the computer, and it was quite common for people to buy the computer first and then the floppy drive later when they'd saved up the money for that. I particularly want to mention that computers then didn't come with a modem. That would be like buying a laptop or a tablet without wifi today. A modem, which I'll talk about in a bit, was another expensive accessory. To cobble together a system in the 80s that was capable of talking to others, with persistent storage (floppy, or hard drive), screen, keyboard, and modem, would be quite expensive. Adjusted for inflation, if you're talking a PC-style device (a clone of the IBM PC that ran DOS), this would easily be more expensive than the MacBook Pros of today. Few people back in the 80s had a computer at home. And the portion of those that had even the capability to get online in a meaningful way was even smaller. Eventually my parents bought a PC clone with 640K RAM and dual floppy drives. This was primarily used for my mom's work, but I did my best to take it over whenever possible. It ran DOS and, despite its monochrome screen, was generally a more capable machine than the CoCo II. For instance, it supported lowercase. (I'm not even kidding; the CoCo II pretty much didn't.) A while later, they purchased a 32MB hard drive for it. What luxury! Just getting a machine to work wasn't easy. Say you'd bought a PC, and then bought a hard drive, and a modem. You didn't just plug in the hard drive and have it work. You would have to fight it every step of the way. The BIOS and DOS partition tables of the day used a cylinder/head/sector method of addressing the drive, and various parts of those addresses had too few bits to work with the big drives of the day, those above 20MB. So you would have to lie to the BIOS and fdisk in various ways, and sort of work out how to do it for each drive. For each peripheral (serial port, sound card in later years, etc.), you'd have to set jumpers for DMA and IRQs, hoping not to conflict with anything already in the system. Perhaps you can now start to see why USB and PCI were so welcomed.
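As an aside for modern readers, the cylinder/head/sector limits mentioned above come straight from the arithmetic of the address fields. Here is a minimal Python sketch of that arithmetic; the geometry values are the commonly cited field limits, offered purely as illustration rather than as a description of the author's specific hardware:

    # Why CHS (cylinder/head/sector) addressing capped usable drive sizes,
    # and why people ended up "lying" to the BIOS and fdisk.
    SECTOR_SIZE = 512  # bytes per sector

    def chs_capacity(cylinders, heads, sectors):
        """Total addressable bytes for a given CHS geometry."""
        return cylinders * heads * sectors * SECTOR_SIZE

    # The widely cited ceiling from combining BIOS and IDE field widths:
    # 1024 cylinders x 16 heads x 63 sectors.
    print(chs_capacity(1024, 16, 63) / 2**20)   # 504.0 MiB

    # Early FAT16 partitions were capped far lower still by a 16-bit
    # sector count, which is why even drives of a few tens of megabytes
    # already meant fighting the partitioning tools.
    print(65536 * SECTOR_SIZE / 2**20)          # 32.0 MiB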

Sharing and finding resources

Despite the two computers in our home, it wasn't as if software written on one machine just ran on another. A lot of software for PC clones assumed a CGA color display. The monochrome HGC in our PC wasn't particularly compatible. You could find a TSR program to emulate the CGA on the HGC, but it wasn't particularly stable, and there's only so much you can do when a program that assumes color is displayed on a monitor that can only show black, dark amber, or light amber. So I'd periodically get to use other computers, most commonly at an office in the evening when it wasn't being used. There were some local computer clubs that my dad took me to periodically. Software was swapped back then: disks copied, shareware exchanged, and so forth. For me, at least, there was no "online" to download software from, and selling software over the Internet wasn't a thing at all.

Three Different Worlds

There were sort of three different worlds of computing experience in the 80s:
  1. Home users. Initially using a wide variety of software from Apple, Commodore, Tandy/RadioShack, etc., but eventually coming to be mostly dominated by IBM PC clones
  2. Small and mid-sized business users. Some of them had larger minicomputers or small mainframes, but most that I had contact with by the early 90s were standardized on DOS-based PCs. More advanced ones had a network running Netware, most commonly. Networking hardware and software was generally too expensive for home users to use in the early days.
  3. Universities and large institutions. These are the places that had the mainframes, the earliest implementations of TCP/IP, the earliest users of UUCP, and so forth.
The difference between the home computing experience and the large institution experience was vast. Not only in terms of dollars (the large institution hardware could easily cost anywhere from tens of thousands to millions of dollars) but also in terms of sheer resources required (large rooms, enormous power circuits, support staff, etc). Nothing was in common between them; not operating systems, not software, not experience. I was never much aware of the third category until the differences started to collapse in the mid-90s, and even then I was only exposed to it once the collapse was well underway. You might say to me, "Well, Google certainly isn't running what I'm running at home!" And, yes, of course, it's different. But fundamentally, most large datacenters are running on x86_64 hardware, with Linux as the operating system, and a TCP/IP network. It's a different scale, obviously, but at a fundamental level, the hardware and operating system stack are pretty similar to what you can readily run at home. Back in the 80s and 90s, this wasn't the case. TCP/IP wasn't even available for DOS or Windows until much later, and when it was, it was a clunky beast that was difficult to work with. One of the things Kevin Driscoll highlights in his book The Modem World (see my short post about it) is that the history of the Internet we usually receive is focused on case 3: the large institutions. In reality, the Internet was and is literally a network of networks. Gateways to and from the Internet existed from all three kinds of users for years, and while TCP/IP ultimately won the battle of the internetworking protocol, the other two streams of users also shaped the Internet as we now know it. Like many, I had no access to the large institution networks, but as I've been reflecting on my experiences, I've found a new appreciation for the way that those of us who grew up with primarily home PCs also shaped the evolution of today's online world.

An Era of Scarcity

I should take a moment to comment about the cost of software back then. A newspaper article from 1985 comments that WordPerfect, then the most powerful word processing program, sold for $495 (or $219 if you could score a mail order discount). That's $1360/$600 in 2022 money. Other popular software, such as Lotus 1-2-3, was up there as well. If you were to buy a new PC clone in the mid to late 80s, it would often cost $2000 in 1980s dollars. Now add a printer: a low-end dot matrix for $300, or a laser for $1500 or even more. A modem: another $300. So the basic system would be $3600, or $9900 in 2022 dollars. If you wanted a nice printer, you're now pushing well over $10,000 in 2022 dollars. You start to see one barrier here, and also why things like shareware and piracy (if it was indeed even recognized as such) were common in those days. So you can see that going from a home computer setup (TRS-80, Commodore C64, Apple ][, etc.) to a business-class PC setup was an order-of-magnitude increase in cost. From there to the high-end minis/mainframes was another order of magnitude (at least!) increase. Eventually there was price pressure on the higher end and things all got better, which is probably why the non-DOS PCs lasted until the early 90s.

Increasing Capabilities

My first exposure to computers in school was in the 4th grade, when I would have been about 9. There was a single Apple ][ machine in that room. I primarily remember playing Oregon Trail on it. The next year, the school added a computer lab. Remember, this is a small rural area, so each graduating class might have about 25 people in it; this lab was shared by everyone in the K-8 building. It was full of some flavor of IBM PS/2 machines running DOS and Netware. There was a dedicated computer teacher too, though I think she was a regular teacher that was given somewhat minimal training on computers. We were going to learn typing that year, but I did so well on the very first typing program that we soon worked out that I could do programming instead. I started going to school early (these machines were far more powerful than the XT at home) and worked on programming projects there. Eventually my parents bought me a Gateway 486SX/25 with a VGA monitor and hard drive. Wow! This was a whole different world. It may have come with Windows 3.0 or 3.1 on it, but I mainly remember running OS/2 on that machine. More on that below.

Programming

That CoCo II came with a BASIC interpreter in ROM. It came with a large manual, which served as a BASIC tutorial as well. The BASIC interpreter was also the shell, so literally you could not use the computer without at least a bit of BASIC. Once I had access to a DOS machine, it also had a BASIC interpreter: GW-BASIC. There was a fair bit of software written in BASIC at the time, but most of the more advanced software wasn't. I wondered how these .EXE and .COM programs were written. I could find vague references to DEBUG.EXE, assemblers, and such. But it wasn't until I got a copy of Turbo Pascal that I was able to do that sort of thing myself. Eventually I got Borland C++ and taught myself C as well. A few years later, I wanted to try writing GUI programs for Windows, and bought Watcom C++, which was much cheaper than the competition and could target Windows, DOS (and I think even OS/2). Notice that, aside from BASIC, none of this was free, and none of it was bundled. You couldn't just download a C compiler, or Python interpreter, or whatnot back then. You had to pay for the ability to write any kind of serious code on the computer you already owned.

The Microsoft Domination

Microsoft came to dominate the PC landscape, and then even the computing landscape as a whole. IBM very quickly lost control over the hardware side of PCs as Compaq and others made clones, but Microsoft has managed, in varying degrees even to this day, to keep a stranglehold on the software, and especially the operating system, side. Yes, there was occasional talk of things like DR-DOS, but by and large the dominant platform came to be the PC, and if you had a PC, you ran DOS (and later Windows) from Microsoft. For a while, it looked like IBM was going to challenge Microsoft on the operating system front; they had OS/2, and when I switched to it sometime around the version 2.1 era in 1993, it was unquestionably more advanced technically than the consumer-grade Windows from Microsoft at the time. It had Internet support baked in, could run most DOS and Windows programs, and had introduced a replacement for the by-then terrible FAT filesystem: HPFS, in 1988. Microsoft wouldn't introduce a better filesystem for its consumer operating systems until Windows XP in 2001, 13 years later. But more on that story later.

Free Software, Shareware, and Commercial Software

I've covered the high cost of software already. Obviously $500 software wasn't going to sell in the home market. So what did we have? Mainly, these things:
  1. Public domain software. It was free to use, and if implemented in BASIC, probably had source code with it too.
  2. Shareware
  3. Commercial software (some of it from small publishers was a lot cheaper than $500)
Let's talk about shareware. The idea with shareware was that a company would release a useful program, sometimes limited. You were encouraged to "register", or pay for, it if you liked it and used it. And, regardless of whether you registered it or not, you were told "please copy!" Sometimes shareware was fully functional, and registering it got you nothing more than printed manuals and an easy conscience (guilt trips for not registering weren't necessarily very subtle). Sometimes unregistered shareware would have a nag screen: a delay of a few seconds while it told you to register. Sometimes they'd be limited in some way; you'd get more features if you registered. With games, it was popular to have a trilogy, and release the first episode (inevitably ending with a cliffhanger) as shareware, and the subsequent episodes would require registration. In any event, a lot of software people used in the 80s and 90s was shareware. Also pirated commercial software, though in the earlier days of computing, I think some people didn't even know the difference. Notice what's missing: Free Software / FLOSS in the Richard Stallman sense of the word. Stallman lived in the big institution world (after all, he worked at MIT), and what he was doing with the Free Software Foundation and GNU project, beginning in 1983, never really filtered into the DOS/Windows world at the time. I had no awareness of it even existing until well into the 90s, when I first started getting some hints of it as a port of gcc became available for OS/2. The Internet was what really brought this home, but I'm getting ahead of myself. I want to say again: FLOSS never really entered the DOS and Windows 3.x ecosystems. You'd see it make a few inroads here and there in later versions of Windows, and more so now that Microsoft has been sort of forced to accept it, but still, reflect on its legacy. What is the software market like in Windows compared to Linux, even today? Now it is, finally, time to talk about connectivity!

Getting On-Line

What does it even mean to "get on-line"? Certainly not connecting to a wifi access point. The answer is, unsurprisingly, complex. But for everyone except the large institutional users, it begins with a telephone.

The telephone system

By the 80s, there was one communication network that already reached into nearly every home in America: the phone system. Virtually every household (note I don't say every person) was uniquely identified by a 10-digit phone number. You could, at least in theory, call up virtually any other phone in the country and be connected in less than a minute. But I've got to talk about cost. The way things worked in the USA, you paid a monthly fee for a phone line. Included in that monthly fee was unlimited local calling. What is a "local call"? That was an extremely complex question. Generally it meant, roughly, calling within your city. But of course, as you deal with things like suburbs and cities growing into each other (e.g., the Dallas-Ft. Worth metroplex), things got complicated fast. But let's just say for simplicity you could call others in your city. What about calling people not in your city? That was "long distance", and you paid (often hugely) by the minute for it. Long distance rates were difficult to figure out, but were generally most expensive during business hours and cheapest at night or on weekends. Prices eventually started to come down when competition was introduced for long distance carriers, but even then you often were stuck with a single carrier for long distance calls outside your city but within your state. Anyhow, let's just leave it at this: local calls were virtually free, and long distance calls were extremely expensive.

Getting a modem

I remember getting a modem that ran at either 1200bps or 2400bps. Either way, quite slow; you could often read even plain text faster than the modem could display it. But what was a modem? A modem hooked up to a computer with a serial cable, and to the phone system. By the time I got one, modems could automatically dial and answer. You would send a command like ATDT5551212 and it would dial 555-1212. Modems had speakers, because often things wouldn't work right, and the telephone system was oriented around speech, so you could hear what was happening. You'd hear it wait for dial tone, then dial, then hopefully the remote end would ring, a modem there would answer, you'd hear the screeching of a handshake, and eventually your terminal would say CONNECT 2400. Now your computer was bridged to the other; anything going out your serial port was encoded as sound by your modem and decoded at the other end, and vice-versa. But what, exactly, was the other end? It might have been another person at their computer. Turn on local echo, and you can see what they did. Maybe you'd send files to each other. But in my case, the answer was different: PC Magazine.
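For readers who never used one: the AT command set those modems spoke is simple enough to show in a few lines. Here is a rough sketch using the third-party pyserial library; the device path, speed, and phone number are placeholders, not details from the story:

    # Driving a Hayes-style modem with AT commands over a serial port.
    # Requires the third-party "pyserial" package (import name: serial).
    import serial

    with serial.Serial("/dev/ttyS0", 2400, timeout=30) as ser:
        ser.write(b"ATZ\r")             # reset the modem
        print(ser.readline())           # modem echoes the command and replies, typically "OK"
        ser.write(b"ATDT5551212\r")     # tone-dial 555-1212
        # On success the modem eventually reports something like CONNECT 2400;
        # from then on, bytes written here go out over the phone line and
        # bytes read come back from whatever answered at the far end.
        print(ser.readline())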

PC Magazine and CompuServe

Starting around 1986 (so I would have been about 6 years old), I got to read PC Magazine. My dad would bring home copies that were being discarded at his office for me to read, and I think eventually bought me a subscription directly. This was not just a standard magazine; it ran something like 350-400 pages an issue, and came out every other week. This thing was a monster. It had reviews of hardware and software, descriptions of upcoming technologies, and pages and pages of ads (which often had some informative value of their own). And they had sections on programming. Many issues would talk about BASIC or Pascal programming, and there'd be a utility in most issues. What do I mean by "a utility in most issues"? Did they include a floppy disk with software? No, of course not. There was a literal program listing printed in the magazine. If you wanted the utility, you had to type it in. And a lot of them were written in assembler, so you had to have an assembler. An assembler, of course, was not free, and I didn't have one. Or maybe they wrote it in Microsoft C, and I had Borland C, and (of course) they weren't compatible. Sometimes they would list the program sort of in binary: line after line of a BASIC program, with lines like 64, 193, 253, 0, 53, 0, 87 that you would type in for hours, hopefully correctly. Running the BASIC program would, if you got it correct, emit a .COM file that you could then run. They did have a rudimentary checksum system built in, but it wasn't even a CRC, so something like swapping two numbers would go unnoticed, except that the program would mysteriously hang. Eventually they teamed up with CompuServe to offer a limited slice of CompuServe for the purpose of downloading PC Magazine utilities. This was called PC MagNet. I am foggy on the details, but I believe that for a time you could connect to the limited PC MagNet part of CompuServe for free (after the cost of the long-distance call, that is) rather than paying for CompuServe itself (because, OF COURSE, that also charged you by the minute). So in the early days, I would get special permission from my parents to place a long distance call, and after some nerve-wracking minutes in which we were aware every minute was racking up charges, I could navigate the menus, download what I wanted, and log off immediately. I still, incidentally, mourn what PC Magazine became. As with computing generally, it followed the mass market. It lost its deep technical chops, cut its programming columns, stopped talking about things like how SCSI worked, and so forth. By the time it stopped printing in 2009, it was no longer a square-bound 400-page behemoth, but rather looked more like a copy of Newsweek, only with less depth.
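The type-in listings with their crude checksums described above are easy to picture with a small sketch. This is purely illustrative: the byte values are a tiny DOS program chosen for the example, and the per-line sum is not PC Magazine's actual checking scheme:

    # The "type-in binary" idea: rows of decimal byte values, a crude
    # per-line checksum, and a .COM file written out at the end.
    DATA = [
        [180, 2, 178, 65],    # mov ah,2 / mov dl,'A'
        [205, 33, 205, 32],   # int 21h  / int 20h   (prints 'A', then exits)
    ]
    LINE_SUMS = [425, 475]    # what the reader checks each typed line against

    def assemble(rows, sums):
        out = bytearray()
        for i, (row, want) in enumerate(zip(rows, sums), start=1):
            if sum(row) != want:
                raise ValueError(f"typo on line {i}: checksum mismatch")
            out.extend(row)
        return bytes(out)

    with open("listing.com", "wb") as f:
        f.write(assemble(DATA, LINE_SUMS))
    # Note that a plain sum cannot catch two swapped values on a line,
    # which matches the failure mode described above.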

Continuing with CompuServe

CompuServe was a much larger service than just PC MagNet. Eventually, our family got a subscription. It was still an expensive and scarce resource; I'd call it only after hours when the long-distance rates were cheapest. Everyone had a numerical username separated by commas; mine was 71510,1421. CompuServe had forums, and files. Eventually I would use TapCIS to queue up things I wanted to do offline, to minimize phone usage online. CompuServe eventually added a gateway to the Internet. For the sum of somewhere around $1 a message, you could send or receive an email from someone with an Internet email address! I remember the thrill of one time, as a kid of probably 11 years, sending a message to one of the editors of PC Magazine and getting a kind, if brief, reply back! But inevitably I had...

The Godzilla Phone Bill

Yes, one month I became lax in tracking my time online. I ran up my parents' phone bill. I don't remember how high, but I remember it was hundreds of dollars, a hefty sum at the time. As I watched Jason Scott's BBS Documentary, I realized how common an experience this was. I think this was the end of CompuServe for me for a while.

Toll-Free Numbers

I lived near a town with a population of 500. Not even IN town, but near town. The calling area included another town with a population of maybe 1500, so all told, there were maybe 2000 people total I could talk to with a local call, though far fewer numbers, because remember, telephones were allocated by the household. There were, as far as I know, zero modems that were a local call (aside from one that belonged to a friend I met in around 1992). So basically everything was long-distance. But there was a special feature of the telephone network: toll-free numbers. Normally when calling long-distance, you, the caller, paid the bill. But with a toll-free number, beginning with 1-800, the recipient paid the bill. These numbers almost inevitably belonged to corporations that wanted to make it easy for people to call. Sales and ordering lines, for instance. Some of these companies started to set up modems on toll-free numbers. There were few of these, but they existed, so of course I had to try them! One of them was a company called PennyWise that sold office supplies. They had a toll-free line you could call with a modem to order stuff. Yes, online ordering before the web! I loved office supplies. And, because I lived far from a big city, if the local K-Mart didn't have it, I probably couldn't get it. Of course, the interface was entirely text, but you could search for products and place orders with the modem. I had loads of fun exploring the system, actually ordered things from them, and probably saved money doing so. With the first order they shipped a monster full-color catalog. That thing must have been 500 pages, like the Sears catalogs of the day. Every item had a part number, which streamlined ordering through the modem.

Inbound FAXes

By the 90s, a number of modems became able to send and receive FAXes as well. For those that don't know, a FAX machine was essentially a special modem. It would scan a page and digitally transmit it over the phone system, where it would (at least in the early days) be printed out in real time, because the machines didn't have the memory to store an entire page as an image. Eventually, PC modems integrated FAX capabilities. There still wasn't anything useful I could do locally, but there were ways I could get other companies to FAX something to me. I remember two of them. One was for US Robotics. They had an on-demand FAX system. You'd call up a toll-free number, which was an automated IVR system. You could navigate through it and select various documents of interest to you: spec sheets and the like. You'd key in your FAX number, hang up, and US Robotics would call YOU and FAX you the documents you wanted. Yes! I was talking to a computer (of a sort) at no cost to me! The New York Times also ran a service for a while called TimesFax. Every day, they would FAX out a page or two of summaries of the day's top stories. This was pretty cool in an era in which I had no other way to access anything from the New York Times. I managed to sign up for TimesFax (I have no idea how, anymore), and for a while I would get a daily FAX of their top stories. When my family got its first laser printer, I could then even print these FAXes complete with the gothic New York Times masthead. Wow! (OK, so technically I could print them on a dot-matrix printer also, but graphics on a 9-pin dot matrix is a kind of pain that is a whole other article.)

My own phone line

Remember how I discussed that phone lines were allocated per household? This was a problem for a lot of reasons:
  1. Anybody that tried to call my family while I was using my modem would get a busy signal (unable to complete the call)
  2. If anybody in the house picked up the phone while I was using it, that would degrade the quality of the ongoing call and either mess up or disconnect the call in progress. In many cases, that could cancel a file transfer (which wasn't necessarily easy or possible to resume), prompting howls of annoyance from me.
  3. Generally we all had to work around each other
So eventually I found various small jobs and used the money I made to pay for my own phone line and my own long distance costs. Eventually I upgraded to a 28.8Kbps US Robotics Courier modem even! Yes, you heard it right: I got a job and a bank account so I could have a phone line and a faster modem. Uh, isn't that why every teenager gets a job? Now my local friend and I could call each other freely, at least on my end (I can't remember if he had his own phone line too). We could exchange files using HS/Link, which had the added benefit of allowing split-screen chat even while a file transfer was in progress. I'm sure we spent hours chatting to each other keyboard-to-keyboard while sharing files.

Technology in Schools

By this point in the story, we're in the late 80s and early 90s. I'm still using PC-style OSs at home; OS/2 in the later years of this period, DOS or maybe a bit of Windows in the earlier years. I mentioned that they let me work on programming at school starting in 5th grade. It was soon apparent that I knew more about computers than anybody on staff, and I started getting pulled out of class to help teachers or administrators with vexing school problems. This continued until I graduated from high school, incidentally often to my enjoyment, and the annoyance of one particular teacher who, I must say, I was fine with annoying in this way. That's not to say that there was institutional support for what I was doing. It was, after all, a small school. Larger schools might have introduced BASIC or maybe Logo in high school. But I had already taught myself BASIC, Pascal, and C by the time I was somewhere around 12 years old. So I wouldn't have had any use for that anyhow. There were programming contests occasionally held in the area. Schools would send teams. My school didn't really send anybody, but I went as an individual. One of them was run by a local college (but for jr. high or high school students). Years later, I met one of the professors that ran it. He remembered me, and that day, better than I did. The programming contest had problems one could solve in BASIC or Logo. I knew nothing about what to expect going into it, but I had lugged my computer and screen along, and asked him, "Can I write my solutions in C?" He was, apparently, stunned, but said, "Sure, go for it." I took first place that day, leading to some rather confused teams from much larger schools. The Netware network that the school had was, as these generally were, itself isolated. There was no link to the Internet or anything like it. Several schools across three local counties eventually invested in a fiber-optic network linking them together. This built a larger, but still closed, network. Its primary purpose was to allow students to be exposed to a wider variety of classes at high schools. Participating schools had an "ITV room", outfitted with cameras and mics. So students at any school could take classes offered over ITV at other schools. For instance, only my school taught German classes, so people at any of those participating schools could take German. It was an early Zoom room. But alongside the TV signal, there was enough bandwidth to run some Netware frames. By about 1995 or so, this let one of the schools purchase some CD-ROM software that was made available on a file server and could be accessed by any participating school. Nice! But Netware was mainly about file and printer sharing; there wasn't even a facility like email, at least not on our deployment.

BBSs

My last hop before the Internet was the BBS. A BBS was a computer program, usually run by a hobbyist like me, on a computer with a modem connected. Callers would call it up, and they'd interact with the BBS. Most BBSs had discussion groups like forums and file areas. Some also had games. I, of course, continued to have that most vexing of problems: they were all long-distance. There were some ways to help with that, chiefly QWK and BlueWave. These, somewhat like TapCIS in the CompuServe days, let me download new message posts for reading offline, and queue up my own messages to send later. QWK and BlueWave didn't help with file downloading, though.

BBSs get networked

BBSs were an interesting thing. You'd call up one, and inevitably somewhere in the file area would be a BBS list. Download the BBS list and you've suddenly got a list of phone numbers to try calling. All of them were long distance, of course. You'd try calling them at random and have a success rate of maybe 20%. The other 80% would be defunct; you might get the dreaded "this number is no longer in service" or the even more dreaded angry human answering the phone (and of course a modem can't talk to a human, so they'd just get silence for probably the nth time that week). The phone company cared nothing about BBSs and recycled their numbers just as fast as any others. To talk to various people, or participate in certain discussion groups, you'd have to call specific BBSs. That's annoying enough in the general case, but even more so for someone paying long distance for it all, because it takes a few minutes to establish a connection to a BBS: handshaking, logging in, menu navigation, etc. But BBSs started talking to each other. The earliest successful such effort was FidoNet, and for the duration of the BBS era, it remained by far the largest. FidoNet was analogous to the UUCP that the institutional users had, but ran on the much cheaper PC hardware. Basically, BBSs that participated in FidoNet would relay email, forum posts, and files between themselves overnight. Eventually, as with UUCP, by hopping through this network, messages could reach around the globe, and forums could have worldwide participation asynchronously, long before they could link to each other directly via the Internet. It was almost entirely volunteer-run.
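The overnight relaying FidoNet did can be modeled in a few lines. The sketch below is conceptual only: it shows store-and-forward hops with a static routing table, not FidoNet's actual packet formats or routing rules, and the node names are made up:

    # Store-and-forward relaying, FidoNet-style: messages queue up during
    # the day and move one hop per nightly exchange.
    from collections import defaultdict

    class Node:
        def __init__(self, name):
            self.name = name
            self.outbound = defaultdict(list)   # next hop -> [(origin, dest, text)]
            self.inbox = []

        def post(self, dest, text, route):
            """Queue a new message for the next nightly run."""
            self.outbound[route[self.name][dest]].append((self.name, dest, text))

    def nightly_exchange(nodes, route):
        """One overnight cycle: every node hands its queued bundles to the next hop."""
        transfers = []
        for node in nodes.values():
            for hop, bundle in node.outbound.items():
                transfers.extend((hop, o, d, t) for o, d, t in bundle)
                bundle.clear()
        for hop, origin, dest, text in transfers:
            if dest == hop:
                nodes[hop].inbox.append((origin, text))       # delivered locally
            else:                                             # relayed onward tomorrow night
                nodes[hop].outbound[route[hop][dest]].append((origin, dest, text))

    # Tiny three-node net: A can only dial B; B can dial both A and C.
    route = {"A": {"B": "B", "C": "B"},
             "B": {"A": "A", "C": "C"},
             "C": {"A": "B", "B": "B"}}
    nodes = {name: Node(name) for name in "ABC"}
    nodes["A"].post("C", "hello from A", route)
    nightly_exchange(nodes, route)     # night 1: the message reaches B
    nightly_exchange(nodes, route)     # night 2: B relays it to C
    print(nodes["C"].inbox)            # [('A', 'hello from A')]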

Running my own BBS

At age 13, I eventually chose to set up my own BBS. It ran on my single phone line, so of course when I was dialing up something else, nobody could dial up me. Not that this was a huge problem; in my town of 500, I probably had a good 1 or 2 regular callers in the beginning. In the PC era, there was a big difference between a server and a client. Server-class software was expensive and rare. Maybe in later years you had an email client, but an email server would be completely unavailable to you as a home user. But with a BBS, I could effectively run a server. I even ran serial lines in our house so that the BBS could be connected from other rooms! Since I was running OS/2, the BBS didn't tie up the computer; I could continue using it for other things. FidoNet had an Internet email gateway. This one, unlike CompuServe's, was free. Once I had a BBS on FidoNet, you could reach me from the Internet using the FidoNet address. This didn't support attachments, but then email of the day didn't really, either. Various others outside Kansas ran FidoNet distribution points. I believe one of them was mgmtsys; my memory is quite vague, but I think they offered a direct gateway and I would call them to pick up Internet mail via FidoNet protocols, but I'm not at all certain of this.

Pros and Cons of the Non-Microsoft World

As mentioned, Microsoft was and is the dominant operating system vendor for PCs. But I left that world in 1993, and here, nearly 30 years later, have never really returned. I got an operating system with more technical capabilities than the DOS and Windows of the day, but the tradeoff was a much smaller software ecosystem. OS/2 could run DOS programs, but it ran OS/2 programs a lot better. So if I were to run a BBS, I wanted one that had a native OS/2 version, limiting me to a small fraction of available BBS server software. On the other hand, since OS/2 was a fully 32-bit operating system, there started to be OS/2 ports of certain software with a Unix heritage; most notably for me at the time, gcc. At some point, I eventually came across the RMS essays and started to be hooked.

Internet: The Hunt Begins

I certainly was aware that the Internet was out there and interesting. But the first problem was: how the heck do I get connected to the Internet?

Computer labs

There was one place that tended to have Internet access: colleges and universities. In 7th grade, I participated in a program that resulted in me being invited to visit Duke University, and in 8th grade, I participated in National History Day, resulting in a trip to visit the University of Maryland. I probably sought out computer labs at both of those. My most distinct memory was finding my way into a computer lab at one of those universities, and it was full of NeXT workstations. I had never seen or used NeXT before, and had no idea how to operate it. I had brought a box of floppy disks, unaware that the DOS disks probably weren't compatible with NeXT. Closer to home, a small college had a computer lab that I could also visit. I would go there, with my stack of floppies, in summer or when the lab wasn't otherwise being used. I remember downloading disk images of FLOSS operating systems: FreeBSD, Slackware, or Debian, at the time. The hash marks from the DOS-based FTP client would creep across the screen as the 1.44MB disk images slowly downloaded. telnet was also available on those machines, so I could telnet to things like public-access Archie servers and libraries (though not Gopher). Still, FTP and telnet access opened up a lot, and I learned quite a bit in those years.
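Those lab FTP sessions translate almost directly into Python's standard ftplib; here is a rough modern equivalent, with the host name and file path as placeholders rather than the actual servers from the story:

    # Anonymous FTP download with a hash-mark progress display,
    # roughly what the old DOS FTP client sessions looked like.
    from ftplib import FTP

    with FTP("ftp.example.org") as ftp, open("disk01.img", "wb") as out:
        ftp.login()                           # anonymous login
        ftp.cwd("/pub/distributions")

        def write_chunk(chunk):
            out.write(chunk)
            print("#", end="", flush=True)    # one hash mark per block received

        ftp.retrbinary("RETR disk01.img", write_chunk)
    print("\ndownload complete")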

Continuing the Journey

At some point, I got a copy of the Whole Internet User's Guide and Catalog, published in 1994. I still have it. If I hadn't already figured it out by then, I certainly became aware from it that Unix was the dominant operating system on the Internet. The examples in Whole Internet covered FTP, telnet, and gopher, all assuming the user somehow got to a Unix prompt. The web was introduced about 300 pages in, clearly viewed as something that wasn't page 1 material. And it covered the command-line www client before introducing the graphical Mosaic. Even then, though, the book highlighted Mosaic's utility as a front-end for Gopher and FTP, and even the ability to launch telnet sessions by clicking on links. But having a copy of the book didn't equate to having any way to run Mosaic. The machines in the computer lab I mentioned above all ran DOS and were incapable of running a graphical browser. I had no SLIP or PPP (both ways to run Internet traffic over a modem) connectivity at home. In short, the Web was something for the large institutional users at the time.

CD-ROMs

As CD-ROMs came out, with their huge (for the day) 650MB capacity, various companies started collecting software that could be downloaded on the Internet and selling it on CD-ROM. The two most popular ones were Walnut Creek CD-ROM and Infomagic. One could buy extensive shareware and gaming collections, and then even entire Linux and BSD distributions. Although not exactly an Internet service per se, it was a way of bringing what might ordinarily only be accessible to institutional users into the home computer realm.

Free Software Jumps In

As I mentioned, by the mid 90s, I had come across RMS's writings about free software, most probably his 1992 essay "Why Software Should Be Free". (Please note, this is not a commentary on the more recently-revealed issues surrounding RMS, but rather his writings and work as I encountered them in the 90s.) The notion of a Free operating system, not just in cost but in openness, was incredibly appealing. Not only could I tinker with it to a much greater extent due to having source for everything, but it included so much software that I'd otherwise have to pay for. Compilers! Interpreters! Editors! Terminal emulators! And, especially, server software of all sorts. There'd be no way I could afford or run Netware, but with a Free Unixy operating system, I could do all that. My interest was obviously piqued. Add to that the fact that I could actually participate and contribute, and I was about to become hooked on something that I've stayed hooked on for decades. But then the question was: which Free operating system? Eventually I chose FreeBSD to begin with; that would have been sometime in 1995. I don't recall the exact reasons for that. I remember downloading Slackware install floppies, and probably the fact that Debian wasn't yet at 1.0 scared me off for a time. FreeBSD's fantastic Handbook (far better than anything I could find for Linux at the time) was no doubt also a factor.

The de Raadt Factor

Why not NetBSD or OpenBSD? The short answer is Theo de Raadt. Somewhere in this time, when I was somewhere between 14 and 16 years old, I asked some questions comparing NetBSD to the other two free BSDs. This was on a NetBSD mailing list, but for some reason Theo saw it and got a flame war going, which CC'd me. Now keep in mind that even if NetBSD had a web presence at the time, it would have been minimal, and I would have (not all that unusually for the time) had no way to access it. I was certainly not aware of the, shall we say, acrimony between Theo and NetBSD. While I had certainly seen an online flamewar before, this took on a different and more disturbing tone; months later, Theo randomly emailed me under the subject "SLIME", saying that I was, well, "SLIME". I seem to recall periodic emails from him thereafter reminding me that he hates me and that he had blocked me. (Disclaimer: I have poor email archives from this period, so the full details are lost to me, but I believe I am accurately conveying these events from over 25 years ago.) This was a surprise, and an unpleasant one. I was trying to learn, and while it is possible I didn't understand some aspect or other of netiquette (or Theo's personal hatred of NetBSD) at the time, still that is not a reason to flame a 16-year-old (though he would have had no way to know my age). This didn't leave any kind of scar, but did leave a lasting impression; to this day, I am particularly concerned with how FLOSS projects handle poisonous people. Debian, for instance, has come a long way in this over the years, and even Linus Torvalds has turned over a new leaf. I don't know if Theo has. In any case, I didn't use NetBSD then. I did try it periodically in the years since, but never found it compelling enough to justify a large switch from Debian. I never tried OpenBSD for various reasons, but one of them was that I didn't want to join a community that tolerates behavior such as Theo's from its leader.

Moving to FreeBSD

Moving from OS/2 to FreeBSD was final. That is, I didn't have enough hard drive space to keep both. I also didn't have the backup capacity to back up OS/2 completely. My BBS, which ran Virtual BBS (and at some point also AdeptXBBS), was deleted and reincarnated in a different form. My BBS was a member of both FidoNet and VirtualNet; the latter was specific to VBBS, and had to be dropped. I believe I may have also had to drop the FidoNet link for a time. This was the biggest change of computing in my life to that point. The earlier experiences hadn't literally destroyed what came before. OS/2 could still run my DOS programs. Its command shell was quite DOS-like. It ran Windows programs. I was going to throw all that away and leap into the unknown. I wish I had saved a copy of my BBS; I would love to see the messages I exchanged back then, or see its menu screens again. I have little memory of what it looked like. But other than that, I have no regrets. Pursuing Free, Unixy operating systems brought me a lot of enjoyment and a good career. That's not to say it was easy. All the problems of not being in the Microsoft ecosystem were magnified under FreeBSD and Linux. In a day before EDID, monitor timings had to be calculated manually, and you risked destroying your monitor if you got them wrong. Word processing and spreadsheet software was pretty much not there for FreeBSD or Linux at the time; I was therefore forced to learn LaTeX and actually appreciated that. Software like PageMaker or CorelDraw was certainly nowhere to be found for those free operating systems either. But I got a ton of new capabilities. I mentioned the BBS didn't shut down, and indeed it didn't. I ran what was surely a supremely unique oddity: a free, dial-in Unix shell server in the middle of a small town in Kansas. I'm sure I provided things such as pine for email and some help text and maybe even printouts for how to use it. The set of callers slowly grew over the time period, in fact. And then I got UUCP.

Enter UUCP

Even throughout all this, there was no local Internet provider and things were still long distance. I had Internet email access via assorted routes, but they were all strange. And, I wanted access to Usenet. In 1995, it happened. The local ISP I mentioned offered UUCP access. Though I couldn't afford the dialup shell (or later, SLIP/PPP) that they offered due to long-distance costs, UUCP's very efficient batched processes looked doable. I believe I established that link when I was 15, so in 1995. I worked to register my domain, complete.org, as well. At the time, the process was a bit lengthy and involved downloading a text file form, filling it out in a precise way, sending it to InterNIC, and probably mailing them a check. Well, I did that, and in September of 1995, complete.org became mine. I set up sendmail on my local system, as well as INN to handle the limited Usenet newsfeed I requested from the ISP. I even ran Majordomo to host some mailing lists, including some that were surprisingly high-traffic for a few-times-a-day long-distance modem UUCP link! The modem client programs for FreeBSD were somewhat less advanced than for OS/2, but I believe I wound up using Minicom or Seyon to continue to dial out to BBSs and, I believe, continue to use Learning Link. So all the while I was setting up my local BBS, I continued to have access to the text Internet, consisting chiefly of Gopher for me.
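UUCP's bang-path style of addressing is simple to illustrate. The sketch below only demonstrates the addressing idea; the host names are placeholders and this is not the real rmail/uucico logic:

    # A bang path like "isp!complete!jgoerzen" names every hop explicitly;
    # each system strips its neighbor off the front and queues the rest
    # for the next batched exchange.
    def route(bang_path):
        hops = []
        while "!" in bang_path:
            hop, _, bang_path = bang_path.partition("!")
            hops.append(hop)
        return hops, bang_path            # relay hops, then the local user

    hops, user = route("isp!complete!jgoerzen")
    print(hops)   # ['isp', 'complete'] -- systems that relay the mail in turn
    print(user)   # 'jgoerzen'          -- delivered locally at the final hop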

Switching to Debian

I switched to Debian sometime in 1995 or 1996, and have been using Debian as my primary OS ever since. I continued to offer shell access, but added the WorldVU Atlantis menuing BBS system. This provided a return to a more BBS-like interface (by default; shell was still an option) as well as some BBS door games such as LoRD and TradeWars 2002, running under DOS emulation. I also continued to run INN, and ran ifgate to allow FidoNet echomail to be presented into INN Usenet-like newsgroups, and netmail to be gated to Unix email. This worked pretty well. The BBS continued to grow in these days, peaking at about two dozen total user accounts, and maybe a dozen regular users.

Dial-up access availability

I believe it was in 1996 that dial-up PPP access finally became available in my small town. What a thrill! FINALLY! I could now FTP, use Gopher, telnet, and the web, all from home. Of course, it was at modem speeds, but still. (Strangely, I have a memory of accessing the Web using WebExplorer from OS/2. I don't know exactly why; it's possible that by this time, I had upgraded to a 486 DX2/66 and was able to reinstall OS/2 on the old 25MHz 486, or maybe something was wrong with the timeline from my memories from 25 years ago above. Or perhaps I made the occasional long-distance call somewhere before I ditched OS/2.) Gopher sites still existed at this point, and I could access them using Netscape Navigator, which likely became my standard Gopher client at that point. I don't recall using the UMN text-mode gopher client locally at that time, though it's certainly possible I did.

The city

Starting when I was 15, I took computer science classes at Wichita State University. The first one was a class in the summer of 1995 on C++. I remember being worried about being good enough for it (I was, after all, just after my HS freshman year and had never taken the prerequisite C class). I loved it and got an A! By 1996, I was taking more classes. In 1996 or 1997 I stayed in Wichita during the day due to having more than one class. So, what would I do then but enjoy the computer lab? The CS dept. had two of them: one that had NCD X terminals connected to a pair of SunOS servers, and another one running Windows. I spent most of the time in the Unix lab with the NCDs; I'd use Netscape or pine, write code, enjoy the University's fast Internet connection, and so forth. In 1997 I graduated from high school, and that summer I moved to Wichita to attend college. As was so often the case, I shut down the BBS at that time. It would be 5 years until I again dealt with the Internet at home in a rural community. By the time I moved to my apartment in Wichita, I had stopped using OS/2 entirely. I have no memory of ever having OS/2 there. Along the way, I had bought a Pentium 166, and then the most expensive piece of computing equipment I have ever owned: a DEC Alpha, which, of course, ran Linux.

ISDN

I must have used dial-up PPP for a time, but I eventually got a job working for the ISP I had used for UUCP, and then PPP. While there, I got a 128Kbps ISDN line installed in my apartment, and they gave me a discount on the service for it. That was around 3x the speed of a modem, and crucially it was always on and gave me a public IP. No longer did I have to use UUCP; now I got to host my own things! By at least 1998, I was running a web server on www.complete.org, and I had an FTP server going as well.

Even Bigger Cities

In 1999 I moved to Dallas, and there got my first broadband connection: an ADSL link at, I think, 1.5Mbps! Now that was something! But it had some reliability problems. I eventually put together a server and had it hosted at an acquaintance's place who had SDSL in his apartment. Within a couple of years, I had switched to various kinds of proper hosting for it, but that is a whole other article. In Indianapolis, I got a cable modem for the first time, with even better speeds but prohibitions on running servers on it. Yuck.

Challenges

Being non-Microsoft continued to have challenges. Until the advent of Firefox, a web browser was one of the biggest. While Netscape supported Linux on i386, it didn't support Linux on Alpha. I hobbled along with various attempts at emulators, old versions of Mosaic, and so forth. And, until StarOffice was open-sourced as OpenOffice, reading Microsoft file formats was also a challenge, though WordPerfect was briefly available for Linux. Over the years, I have become used to the Linux ecosystem. Perhaps I use Gimp instead of Photoshop and digiKam instead of, well, whatever somebody would use on Windows. But I get ZFS, and containers, and so much that isn't available there. Yes, I know Apple never went away and is a thing, but for most of the time period I discuss in this article, at least after the rise of DOS, it was niche compared to the PC market.

Back to Kansas

In 2002, I moved back to Kansas, to a rural home near a different small town in the county next to where I grew up. Over there, it was back to dial-up at home, but I had faster access at work. I didn't much care for this, and thus began a 20+-year effort to get broadband in the country. At first, I got a wireless link, which worked well enough in the winter, but had serious problems in the summer when the trees leafed out. Eventually DSL became available locally; it was highly unreliable, but still, it was something. Then I moved back to the community I grew up in, a few miles from where I grew up. Again I got DSL, a bit better this time. But after some years, being at the end of the run of DSL meant I had poor speeds and reliability problems. I eventually switched to various wireless ISPs, which continues to the present day; while people in cities can get Gbps service, I can get, at best, about 50Mbps. Long-distance fees are gone, but the speed disparity remains.

Concluding Reflections

I am glad I grew up where I did; the strong community has a lot of advantages I don't have room to discuss here. In a number of very real senses, having no local services made things a lot more difficult than they otherwise would have been. However, perhaps I could say that I also learned a lot through the need to come up with inventive solutions to those challenges. To this day, I think a lot about computing in remote environments: partially because I live in one, and partially because I enjoy visiting places that are remote enough that they have no Internet, phone, or cell service whatsoever. I have written articles like Tools for Communicating Offline and in Difficult Circumstances based on my own personal experience. I instinctively think about making protocols robust in the face of various kinds of connectivity failures because I experience various kinds of connectivity failures myself.

(Almost) Everything Lives On

In 2002, Gopher turned 10 years old. It had probably been about 9 or 10 years since I had first used Gopher, which was the first way I got on the live Internet from my house. It was hard to believe. By that point, I had an always-on Internet link at home and at work. I had my Alpha, and probably also at least PCMCIA Ethernet for a laptop (many laptops had modems by the 90s also). Despite Gopher's popularity in the early 90s, less than 10 years after it came on the scene and started to unify the Internet, it was mostly forgotten. And it was at that moment that I decided to try to resurrect it. The University of Minnesota finally released it under an Open Source license. I wrote the first new gopher server in years, pygopherd, and introduced gopher to Debian. Gopher lives on; there are now quite a few Gopher clients and servers out there, newly started post-2002. The Gemini protocol can be thought of as something akin to Gopher 2.0, and it too has a small but blossoming ecosystem. Archie, the old FTP search tool, is dead, though. Same for WAIS and a number of the other pre-web search tools. But still, even FTP lives on today. And BBSs? Well, they didn't go away either. Jason Scott's fabulous BBS documentary looks back at the history of the BBS, while Back to the BBS from last year talks about the modern BBS scene. FidoNet somehow is still alive and kicking. UUCP still has its place and has inspired a whole string of successors. Some, like NNCP, are clearly direct descendants of UUCP. Filespooler lives in that ecosystem, and you can even see UUCP concepts in projects as far afield as Syncthing and Meshtastic. Usenet still exists, and you can now run Usenet over NNCP just as I ran Usenet over UUCP back in the day (which you can still do as well). Telnet, of course, has been largely supplanted by ssh, but the concept is more popular now than ever, as Linux has made ssh available on everything from the Raspberry Pi to Android. And I still run a Gopher server, looking pretty much like it did in 2002. This post also has a permanent home on my website, where it may be periodically updated.
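For anyone curious why writing a new Gopher server in 2002 was even feasible for one person: the protocol is tiny. A client opens a TCP connection to port 70, sends a selector string, and reads until the server closes the connection. A minimal sketch (the host name is a placeholder):

    # Fetch a Gopher menu or document: connect to port 70, send the
    # selector followed by CRLF, and read until the connection closes.
    import socket

    def gopher_fetch(host, selector="", port=70):
        with socket.create_connection((host, port), timeout=30) as sock:
            sock.sendall(selector.encode("utf-8") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    # An empty selector asks the server for its top-level menu.
    print(gopher_fetch("gopher.example.org").decode("utf-8", errors="replace"))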

12 July 2020

Enrico Zini: Police brutality links

I was a police officer for nearly ten years and I was a bastard. We all were.
As nationwide protests over the deaths of George Floyd and Breonna Taylor are met with police brutality, John Oliver discusses how the histories of policing ...
The death of Stefano Cucchi occurred in Rome on 22 October 2009 while the young man was being held in pre-trial detention. The causes of his death and the responsibilities for it are the subject of judicial proceedings that have involved, on one side, the doctors of the Pertini hospital,[1][2][3][4] and on the other continue to involve, in various capacities, several members of the Carabinieri[5][6]. The case drew public attention after the publication of the autopsy photos, later picked up by Italian press agencies, newspapers and TV news programmes[7]. The affair has also inspired documentaries and feature films.[8][9][10]
The death of Giuseppe Uva occurred on 14 June 2008 after he was stopped, drunk, by two Carabinieri during the night between 13 and 14 June. They took him to the barracks, from which he was then transferred, for compulsory medical treatment, to the hospital in Varese, where he died the following morning of cardiac arrest. According to the prosecution, his death was caused by the physical restraint used during the arrest and by the violence and torture he subsequently suffered in the barracks. The trial of the two Carabinieri who carried out the arrest and of six other police officers ended with the defendants being acquitted of the charges of unintentional homicide and kidnapping[1][2][3][4]. The documentary Viva la sposa by Ascanio Celestini is dedicated to the case[1][5].
The Aldrovandi case is the judicial affair arising from the killing of Federico Aldrovandi, a student from Ferrara, on 25 September 2005 following a police stop.[1][2][3] On 6 July 2009 the courts sentenced four police officers to 3 years and 6 months of imprisonment for "culpable excess in the legitimate use of weapons";[1][4] on 21 June 2012 the Court of Cassation upheld the conviction.[1] The investigation into the cause of death was followed by others into alleged cover-ups and into the complaints filed between the parties involved.[1] The case received a great deal of media attention and inspired a documentary, È stato morto un ragazzo.[1][5]
Federico Aldrovandi (17 July 1987, Ferrara – 25 September 2005, Ferrara) was an Italian student who was killed by four policemen.[1]
24 June 2020

22 August 2017

John Goerzen: The Eclipse

Highway US-81 in northern Kansas and southern Nebraska is normally a pleasant, sleepy sort of drive. It was upgraded to a 4-lane road not too long ago, but as far as 4-lane roads go, its traffic is typically light. For drives from Kansas to South Dakota, it makes a pleasant route. Yesterday was eclipse day. I strongly suspect that highway 81 had more traffic that day than it ever has before, or ever will again. For nearly the entire 3-hour drive to Geneva, NE, it was packed, though mostly still moving at a good speed. And for our entire drive back, highway 81 and every other southbound road we used was so full it felt like rush hour in Dallas. (Well, not quite. Traffic was still moving.) I believe scenes like this were played out across the continent. I've been taking a lot of photos, and writing about our new baby Martha lately. Now it's time to write a bit about some more adventures with Jacob and Oliver; they're now in third and fifth grades in school. We had been planning to fly, and airports I called were either full, or were planning to park planes in the grass, or even shut down some runways to use for parking. The airport in the little town of Beatrice, NE (which I had visited twice before) was even going to have a temporary FAA control tower. At the last minute, due to some storm activity near home at departure time, we unloaded the plane and drove instead. The atmosphere at the fairgrounds in Geneva was festive. One family had brought bubbles for their kids and extras to share. I had bought the boys a book about the eclipse, which they were reading before and during the event. They were both great, safe users of their eclipse glasses. Jacob caught a toad, and played with it for a while. He wanted to bring it home with us, but I convinced him to let me take a picture of him with his toad friend instead. While we were waiting for totality, a number of buses from the local school district arrived. So by the time the big moment arrived, we could hear the distant roar of delight and applause from the school children gathered at the far end of the field, plus all the excitement nearby. Both boys were absolutely ecstatic to be witnessing it (and so was I!) "Wow!", "Awesome!", and simple cackles of delight were heard. On the drive home, they both kept talking about how amazing it was, and how it was once in a lifetime. We enjoyed our eclipse neighbors: the woman from San Antonio next to us, the surprise discovery of another family from just a few miles from us parked two cars down, even running into relatives at a restaurant on the way home. The applause from all around when it started and when it ended. And the feeling, which is hard to describe, of awe and amazement at the wonders of our world and our universe. There are many problems with the world right now, but somehow there's something right about people coming together from all over to enjoy it.

5 January 2016

Benjamin Mako Hill: Celebrate Aaron Swartz in Seattle (or Atlanta, Chicago, Dallas, NYC, SF)

I'm organizing an event at the University of Washington in Seattle that involves a reading, the screening of a documentary film, and a Q&A about Aaron Swartz. The event coincides with the third anniversary of Aaron's death and the release of a new book of Swartz's writing that I contributed to. The event is free and open to the public and details are below:

WHEN: Wednesday, January 13 at 6:30-9:30 p.m.

WHERE: Communications Building (CMU) 120, University of Washington

We invite you to celebrate the life and activism efforts of Aaron Swartz, hosted by UW Communication professor Benjamin Mako Hill. The event is next week and will consist of a short book reading, a screening of a documentary about Aaron's life, and a Q&A with Mako, who knew Aaron well; details are below. No RSVP required; we hope you can join us.

Aaron Swartz was a programming prodigy, entrepreneur, and information activist who contributed to the core Internet protocol RSS and co-founded Reddit, among other groundbreaking work. However, it was his efforts in social justice and political organizing combined with his aggressive approach to promoting increased access to information that entangled him in a two-year legal nightmare that ended with the taking of his own life at the age of 26.

January 11, 2016 marks the third anniversary of his death. Join us two days later for a reading from a new posthumous collection of Swartz's writing published by New Press, a showing of The Internet's Own Boy (a documentary about his life), and a Q&A with UW Communication professor Benjamin Mako Hill, a former roommate and friend of Swartz and a contributor to and co-editor of the first section of the new book. If you're not in Seattle, there are events with similar programs being organized in Atlanta, Chicago, Dallas, New York, and San Francisco. All of these other events will be on Monday, January 11, and registration is required for all of them. I will be speaking at the event in San Francisco.

16 January 2015

Erich Schubert: Year 2014 in Review as Seen by a Trend Detection System

We ran our trend detection tool Signi-Trend (published at KDD 2014) on news articles collected for the year 2014. We removed the category of financial news, which is overrepresented in the data set. Below are the (described) results from the top 50 trends (I will push the raw results to appspot if file size limits allow). The top 10 trends are highlighted in bold.
January
2014-01-29: Obama's State of the Union address
February
2014-02-05..23: Sochi Olympics (11x, including the four below)
2014-02-07: Gay rights protesters arrested at Sochi Olympics
2014-02-08: Sochi Olympics begins
2014-02-16: Injuries in Sochi Extreme Park
2014-02-17: Men's Snowboard cross finals called off because of fog
2014-02-19: Violence in Ukraine and Kiev
2014-02-22: Yanukovich leaves Kiev
2014-02-23: Sochi Olympics close
2014-02-28: Crimea crisis begins
March
2014-03-01..06: Crimea crisis escalates further (3x)
2014-03-08: Malaysia Airlines plane missing in South China Sea (2x)
2014-03-18: Crimea now considered part of Russia by Putin
2014-03-28: U.N. condemns Crimea's secession
April
2014-04-17..18: Russia-Ukraine crisis continues (3x)
2014-04-20: South Korea ferry accident
May
2014-05-18: Cannes film festival
2014-05-25: EU elections
June
2014-06-13: Islamic state fighting in Iraq
2014-06-16: U.S. talks to Iran about Iraq
July
2014-07-17..19: Malaysian airline shot down over Ukraine (3x)
2014-07-20: Israel shelling Gaza kills 40+ in a day
August
2014-08-07: Russia bans EU food imports
2014-08-20: Obama orders U.S. air strikes in Iraq against IS
2014-08-30: EU increases sanctions against Russia
September
2014-09-04: NATO summit
2014-09-23: Obama orders more U.S. air strikes against IS
October
2014-10-16: Ebola case in Dallas
2014-10-24: Ebola patient in New York is stable
November
2014-11-02: Elections: Romania, and U.S. ramp-up
2014-11-05: U.S. Senate elections
2014-11-25: Ferguson prosecution
December
2014-12-08: IOC Olympics sport additions
2014-12-11: CIA prisoner center in Thailand
2014-12-15: Sydney cafe hostage siege
2014-12-17: U.S. and Cuba relations improve unexpectedly
2014-12-19: North Korea blamed for Sony cyber attack
2014-12-28: AirAsia flight 8501 missing

28 September 2014

Ean Schuessler: RoboJuggy at JavaOne

A few months ago I was showing my friend Bruno Souza the work I had been doing with my childhood friend and robotics genius, David Hanson. I had been watching what David was going through in his process of creating life-like robots with the limited industrial software available for motor control. I had suggested to David that binding motors to Blender control structures was a genuinely viable possibility. David talked with his forward-looking CEO, Jong Lee, and they were gracious enough to invite me to Hong Kong to make this exciting idea a reality. Working closely with the HRI team (Vytas, Gabrielos, Fabien and Davide) and with David's friends and collaborators at OpenCog (Ben Goertzel, Mandeep, David, Jamie, Alex and Samuel), a month-long creative hack-fest yielded pretty amazing results. Bruno is an avid puppeteer, a global organizer of Java user groups and creator of Juggy the Java Finch, mascot of Java users and user groups everywhere. We started talking about how cool it would be to have a robot version of Juggy. When I was in China I had spent a little time playing with Mark Tilden's RSMedia and various versions of David's hobby-servo-based emotive heads. Bruno and I did a little research into the ROS Java bindings for the Robot Operating System and decided that if we could make that part of the picture we had a great and fun idea for a JavaOne talk. Hunting and gathering I tracked down a fairly priced RSMedia in Alaska, Bruno put a pair of rubber Juggy puppet heads in the mail and we were on our way.
We had decided that we wanted RoboJuggy to be able to run about untethered and the new Raspberry Pi B+ seemed like the perfect low-power brain to make that happen. I like the Debian-based Raspbian distributions but had lately started using the netinst Pi images. These get your Pi up and running in about 15 minutes with a nicely minimalistic install instead of a pile of dependencies you probably don't need. I'd recommend anyone interested in duplicating our work start their journey there: Raspbian UA Net Installer. Robots seem like an embedded application but ROS only ships packages for Ubuntu. I was pleasantly surprised that there are very good instructions for building ROS from source on the Pi. I ended up following these instructions: Setting up ROS Hydro on the Raspberry Pi. Building from source means that your whole install ends up being isolated (in ROS speak) and your file locations and build instructions end up being subtly different. As explained in the linked article, this process is also very time consuming. One thing I would recommend once you get past this step is to use the UNIX dd command to back up your entire SD card to a desktop. This way if you make a mess of things in later steps you can restore your install to a pristine Raspbian+ROS install. If your SD drive was on /dev/sdb you might use something like this to do the job:
sudo dd bs=4M if=/dev/sdb | gzip > /home/your_username/image`date +%d%m%y`.gz
Getting Java in the mix Once you have your Pi all set up with minimal Raspbian and ROS you are going to want a Java VM. The Pi runs the ARM CPU so you need the corresponding version of Java. I tried getting things going initially with OpenJDK and I had some issues with that. I will work on resolving that in the future because I would like to have a 100% Free Software kit for this, but since this was for JavaOne I also wanted JDK8, which isn't available in Debian yet. So, I downloaded the Oracle JDK8 package for ARM: Java 8 JDK for ARM. At this point you are ready to start installing the ROS Java packages. I'm pretty sure the way I did this initially is wrong but I was trying to reconcile the two install procedures for ROS Java and ROS Hydro for Raspberry Pi. I started by following these directions for ROS Java, but with a few exceptions (you have to click the "install from source" link in the page to see the right stuff): Installing ROS Java on Hydro. Now these instructions are good but this is a Pi running Debian and not an Ubuntu install. You won't run the apt-get package commands because those tools were already installed in your earlier steps. Also, this creates its own workspace and we really want these packages all in one workspace. You can apparently chain workspaces in ROS but I didn't understand this well enough to get it working, so what I did was this:
> mkdir -p ~/rosjava 
> wstool init -j4 ~/rosjava/src https://raw.github.com/rosjava/rosjava/hydro/rosjava.rosinstall
> source ~/ros_catkin_ws/install_isolated/setup.bash
> cd ~/rosjava
# Make sure we've got all rosdeps and msg packages.
> rosdep update 
> rosdep install --from-paths src -i -y
and then copied the sources installed into ~/rosjava/src into my main ~/ros_catkin_ws/src. Once those were copied over I was able to run a standard build.
> catkin_make_isolated --install
Like the main ROS install, this process will take a little while. The Java gradle builds take an especially long time. One thing I would recommend to speed up your workflow is to have an x86 Debian install (native desktop, QEMU instance, docker, whatever) and do these same build-from-source installs there. This will let you try your steps out on a much faster system before you try them out on the Pi. That can be a big time saver. Putting together the pieces Around this time my RSMedia had finally shown up from Alaska. At first I thought I had a broken unit because it would power up, complain about not passing system tests and then shut back down. It turns out that if you just put the D batteries in and miss the four AAs, it will kind of pretend to be working, so watch for that mistake. Other parts were starting to roll in as well. The rubber puppet heads had made their way through Brazilian customs and my Pololu Mini Maestro 24 had also shown up, as well as my servo motors and pan and tilt camera rig. I had previously bought a set of 10 motors for goofing around, so I bought the pan and tilt rig by itself for about $5(!), but you can buy a complete set for around $25 from a number of eBay stores. A bit more about the Pololu: this astonishing little motor controller costs about $25 and gives you control of 24 motors with an easy-to-use, high-level serial API. It is probably also possible to control these servos directly from the Pi and eliminate this board, but that will be genuinely difficult because of the real-time timing issues. For $25 this thing is a real gem and you won't regret buying it. Now it was time to start dissecting the RSMedia and getting control of its brain. Unfortunately a lot of great information about the RSMedia has floated away since it was in its heyday 5 years ago, but there is still some solid information out there that we need to round up and preserve. A great resource is the SourceForge-based website at http://rsmediadevkit.sourceforge.net. That site has links to a number of useful sites. You will definitely want to check out their wiki. To disassemble the RSMedia I followed their instructions. I will say, it would be smart to take more pictures as you are going, because they don't take as many as they should. I took pictures of each board and its associated connections as I dismantled the unit and that helped me get things back together later. Another important note is that if all you want to do is solder onto the control board and not replace the head, then it's feasible to solder the board in place without completely disassembling the unit. Now I also had to start adjusting the puppet head, building an armature for the motors to control it and hooking it into the robot. I need to take some more photos of the actual armature. I like to use cardboard for this kind of stuff because it is so fast to work with and relatively strong. One trick I have also learned about cardboard is that if you get something going with it and you need it to be a little more production strength, you can paint it down with fiberglass resin from your local auto store. Once it dries it becomes incredibly tough because it soaks through the fibers of the cardboard and hardens around them.
You will want to do this in a well-ventilated area, but it's a great way to build super tough prototypes. Another prototyping trick I can suggest is using a combination of Velcro and zip ties to hook things together. The result is surprisingly strong and still easy to take apart if things aren't working out. Velcro self-adhesive pads stick to rubber like magic, and that is actually how I hooked the jaw servo onto the mask. Since the puppet head had come all the way from Brazil, I decided to cook some chicken hearts in the churrascaria style while I worked on it in the garage. This may sound gross but I'm telling you, you need to try it! I soaked mine in soy sauce, Sriracha and Chinese cooking wine. Delicious, but I digress. As I was eating my chicken hearts I was also connecting the pan and tilt armature onto the puppet's jaw and eye assembly. It took me most of the evening to get all this going, but by about one in the morning things were starting to look good! I only had a few days left to hack things together before JavaOne and things were starting to get tight. I had so much to do and had also started to run into some nasty surprises with the ROS Java control software. It turns out that ROS Java is less than friendly with ROS message structures that are not "built in". I had tried to follow the provided instructions but was not (and still have not) been able to get that working: Using unofficial messages with ROS Java. I still needed to get control of the RSMedia. Doing that required the delicate operation of soldering to its control board. On the board there is a set of pins that provides a serial interface to the ARM-based embedded Linux computer that controls the robot. To do that I followed these excellent instructions: Connecting to the RSMedia Linux Console Port. After some sweaty time bent over a magnifying glass I had success. I had previously purchased the USB-TTL232 accessory described in the article from the awesome Tanner Electronics store in Dallas. If you are a geek I would recommend that you go there and say hi to its proprietor (and walking encyclopedia of electronics knowledge) Jim Tanner. It was very gratifying when I started a copy of minicom, set it to 115200, N, 8, 1, plugged the serial widget into the RSMedia and booted it up. I was greeted with a clearly recognizable Linux startup and console prompt. At first I thought I had done something wrong because I couldn't get it to respond to commands, but I quickly realized I had flow control turned on. Once it was turned off I was able to navigate around the file system, execute commands and have some fun. A little research and I found this useful resource which let me get all kinds of body movements going: A collection of useful commands for the RSMedia. At this point, I had a usable set of controls for the body as well as the neck armature. I had a controller running the industry's latest and greatest robotics framework that could run on the RSMedia without being tethered to power, and I had most of a connection to Java going. Now I just had to get all those pieces working together. The only problem was that time was running out; I only had a couple of days until my talk and still had to pack and square things away at work. The last day was spent doing things that I wouldn't be able to do on the road. My brother Erik (a fantastic artist) came over to help paint up the Juggy head and fix the eyeball armature.
He used a mix of oil paint and rubber cement, which stuck to the mask beautifully. I bought battery packs for the USB Pi power and the 6V motor control and integrated them into a box that could sit below the neck armature. I fixed up a cloth neck sleeve that could cover everything. Luckily, during all this my beautiful and ever-so-supportive girlfriend Becca had helped me get packed, or I probably wouldn't have made it out the door. Welcome to San Francisco THIS ARTICLE IS STILL BEING WRITTEN

10 November 2012

Martín Ferrari: Amusing ourselves to death

What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. --Foreword to the book. Just finished this book today. It was written in 1985, and it talks about Dallas, and the 700 Club, Reagan and many other (by now) old-fashioned things. It only talks about computers in passing, as this new thing that everybody is talking about. And it is extremely current. It's been 27 years, and it could have been written yesterday. You'll just need to replace some TV shows with Twitter, Reddit, or G+, but it has made me think a lot about my own relationship with information, and amusement. I couldn't help but think about this book when reading various essays that touch on culture, education, or politics these days. It has also made me realise my own incongruence: being proud of not owning a TV, while I spend hours of my waking time watching cat videos, news snippets in Google Reader, or curating links that have been reshared dozens of times. If you're at all interested in understanding our culture and discourse, do yourself a favor and get a copy. It's not long, nor dense in academic jargon. It's only 9 euro with free shipping from the Book Depository, and they even have a Kindle version on Amazon. Tags: Libros, Planet Debian

11 September 2011

John Goerzen: Mexico Part 1: Genesis and Travel

My family and I (including our boys) are just back from a great visit to Mexico. It was my first time there, and also the first time our boys have been outside the USA. I'll be writing about all the fun stuff in the posts to come, so you'll have to bear with me on this one as I describe why we did something that appears to horrify a segment of Americans. About a year ago, I wrote a review of Travel as a Political Act by Rick Steves. Rick's point wasn't actually directly political in the conventional sense, but can be summed up as:
I've taught people how to travel. I focus mostly on the logistics: finding the right hotel, avoiding long lines. But that's not why we travel. We travel to have enlightening experiences, to meet inspirational people, to be stimulated, to learn, and to grow. Travel has taught me the fun in having my cultural furniture rearranged and my ethnocentric self-assuredness walloped.
He speaks of giving oneself permission to have a conversation with someone that doesn't speak a language you know, for instance. I got an email this spring from my longtime friend Jonathan Hall, who had moved to Mexico a couple of years ago. He invited my family and me to go to Mexico, generously offered to host us and show us around, and specifically mentioned my review of Rick Steves as a reason to do so; this was immediately intriguing. Due to his other plans, it was also something of a limited-time offer. Those of you that don't live in the USA may not know what sort of stories we get about Mexico over here. Most of them involve either illegal immigration to the USA or the Mexican drug war. Occasionally there is some sort of drug-related violence on the US side of the border as well, which always makes the news. There are a lot of people that have the perception that Mexico is a dangerous place to be. Terah even knows some people that grew up in Mexico and are too afraid to return. So we did some research, asked some questions, and it became pretty clear that yes, some parts of Mexico are dangerous, but many parts are really quite safe, and Jonathan had invited us to one of those. The reactions we got from Americans when we told them of our plan ranged from excitement that we would get to visit a beautiful country to concern and worry about our safety. Besides that, I knew almost no Spanish, and Terah had only a few high school and college classes from years ago to go on. And we knew that Mexico would, in many ways, be more different from the USA than Germany was. We concluded that this would get us out of our comfort zone in a significant way, have lots of great things to do, be a good experience for the boys, and be something that we wouldn't do without Jonathan. So we bought the tickets for it! As you might notice, I'm quite glad we did. I'll follow up with the details in the next stories, but for today I'll end with the story of getting there. It involves two cute boys, so of course it'll be interesting. Jacob is almost 5 and Oliver is 2, and neither one of them had flown for over a year. Despite leaving home at 5AM to drive to the airport (about an hour away), they were both awake and alert. Jacob was jabbering away the whole way there. He enjoyed the security process and found it interesting; I had to explain that they were checking to make sure everyone was following the airplane rules. Then as we walked to our gate, he pressed his face against every window, looking at the parked airplanes in the dark. Oliver would see him doing this and run over to join him. With a bit of help, he noticed some airplanes had United Express painted on them, and others had American with an AA on them. He would talk about American AA and United Express airplanes for the rest of the week whenever he heard one in the sky. I sat between the boys on our flight to Dallas, and Terah was in the row ahead of us. This is how Jacob spent most of the flight. And Oliver, who had the aisle seat, enjoyed paging through the inflight magazines, safety brochure, wifi instruction card, and airsickness bag. The real highlight came at the Dallas/Ft. Worth (DFW) airport, where we had a connection. And the reason: we needed to take a train to get to our connecting gate. The DFW Skylink system really works very well, but the boys cared most that it's a TRAIN. It was difficult to get them both away from it when we got off. They wanted to see it leave, stay and watch the next one come, etc.
Jacob was only happy when he realized he could see the Skylink trains running high above the window at our departure gate for Guadalajara. The 2.5-hour flight to Guadalajara got a little long for them, but they managed OK. We struck up several conversations with friendly people who knew English as we waited in various lines. It seems to be something of a rarity for American families with young children and no Mexican heritage to visit Mexico. People went out of their way to be friendly and welcoming, even the customs officials. It felt like a great start to the visit. One Mexican man who was chatting with me encouraged me to learn some Spanish. I said my wife knows some, and that I had tried to learn some German back when I took foreign language classes. Laughing, he said, "Why would you do that?" Not as an insult; it just genuinely didn't seem useful to him. I think they were happy and proud that someone wanted to visit Mexico and was excited about it! More to come.

2 November 2010

Tollef Fog Heen: Temperature logging with 1-wire

Last night, I finally got my temperature sensors going, including a nice and shiny munin plugin giving me pretty graphs. So far, I only have a sensor in the loft, but I'll spend some days putting sensors in the rest of the house as well. Robert McQueen asked me on Twitter how this all was set up, so I figured I'd blog about it. The sensors I'm using are the DS18B20 ones from Dallas Semiconductor. You can probably buy them from your local electronics supplier, but mine charges around 75 NOK a piece, so I just bought some off eBay. It takes a bit longer, but I paid about 1/10th the price. For logging, I'm using my NAS, which is just a machine running Debian, a USB-to-serial adapter and a serial-to-1-wire adapter. Thanks a lot to Martin Bergek for the writeup and the ELFA part numbers for the diodes. Since I'm lazy, I ended up just writing a plugin for munin. It uses owfs, which I downloaded from mentors.debian.net. I also offered sponsorship for it, assuming a few small issues are cleaned up, so hopefully you will be able to install it using just Debian in the near future. owfs is fairly easy to work with, and the plugin uses the aliased names if you provide aliases, so you can know what the temperature in a given location is, rather than having to remember 64-bit serial numbers.
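To give a concrete flavour of what such a plugin can look like: owfs exposes each 1-wire device as a directory under its mountpoint, with the DS18B20 temperature available as a plain file you can read. The sketch below is not the plugin described above, just a minimal illustration of the idea in Python, assuming a hypothetical owfs mountpoint of /mnt/1wire and munin's usual config/fetch convention; adjust paths and use owfs aliases as suits your setup.

#!/usr/bin/env python
# Minimal sketch of a munin-style temperature plugin backed by owfs.
# Assumptions (not from the original post): owfs is mounted at /mnt/1wire,
# and DS18B20 sensors show up as 28.* directories with a "temperature" file.
import glob
import os
import sys

OWFS_MOUNT = "/mnt/1wire"  # hypothetical mountpoint; change to match your setup

def sensors():
    """Yield (field_name, path) for every DS18B20 found under the owfs mount."""
    for dev in sorted(glob.glob(os.path.join(OWFS_MOUNT, "28.*"))):
        # munin field names may not contain dots, so replace them
        name = os.path.basename(dev).replace(".", "_")
        yield name, os.path.join(dev, "temperature")

def main():
    if len(sys.argv) > 1 and sys.argv[1] == "config":
        # munin calls the plugin with "config" to learn about the graph
        print("graph_title 1-wire temperatures")
        print("graph_vlabel degrees Celsius")
        for name, _ in sensors():
            print("%s.label %s" % (name, name))
        return
    # Default invocation: print the current value for each sensor
    for name, path in sensors():
        with open(path) as f:
            print("%s.value %s" % (name, f.read().strip()))

if __name__ == "__main__":
    main()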

15 February 2010

Russell Coker: Links February 2010

Popular Mechanics has a good article about 9/11 [1]. Experts in all the relevant fields were consulted to debunk popular myths. It's an old article but I hadn't read it before and learned a lot. Former CIA analyst Ray McGovern and former FBI attorney/special agent Coleen Rowley, a colleague in Veteran Intelligence Professionals for Sanity, wrote an interesting article titled Why Counter-Terrorism Is in Shambles [2]. Such sanity from people who are associated with the intelligence industry is unusual. Gizmodo has an amusing and informative poster about the true risks of airline travel post-9/11 [3]. Reuters has an interesting article about drug smugglers using Gulfstream and 727 aircraft to smuggle cocaine from South America to Africa [4]. They claim a link to al Qaeda, but such a link seems tenuous from the evidence provided; it does seem reasonable to claim that groups who claim affiliation to al Qaeda are involved in smuggling, but anyone can claim anything really. An 8-year-old boy is on the TSA "terrorism watch list"; he regularly gets frisked when traveling by air [5]. His mother had a security clearance to fly on Air Force 2 when Al Gore was the Vice President; any sane security system would look at the parents rather than an 8-year-old child, since children of that age aren't going to independently become terrorists. The Dallas Observer has an interesting article by Kimberly Thorpe about how to beat debt collectors [6]. Apparently most debt collectors break the law in some way and can be sued for damages, with a typical settlement of $3,500. Some debtors are suing multiple debt collectors: after one debt collector is successfully sued, the debt is passed to another collector who also breaks the law. What I really like about this is that the community of people who sue debt collectors keeps the industry honest and protects the majority of the population who don't have the time or interest for engaging in lawsuits. Read Write Web has an informative article about SourceForge being forced to deny access to people in Cuba, Iran, North Korea, Sudan and Syria [7]. A problem for free software developers is that we often don't know the location of the people we collaborate with, so it's best to be as open as possible. This means that the US is not a good place to host servers; probably some part of the EU would be better. Also, this sort of thing makes the field of free software development less welcoming to US citizens. Did the congress people learn nothing in high school? They should know that someone who starts a campaign of ostracism may end up being in the small group. Google is developing a new Native Client (NaCl) system that seems to be like Microsoft ActiveX [8]. I can't imagine this doing anything that couldn't be done with Java, and it seems most likely to just marginalise the less popular platforms, which isn't in the best interests of Google. Kevin Kelly of the Technium wrote an interesting post about 1000 true fans [9]. The concept is that if you are doing creative work you only need 1000 dedicated fans who buy everything you sell to make a living. Make $20 per year from each of the 1000 fans and you will earn enough to live. Make $100 per year from each of the 1000 fans and you will be earning more money than most people.
The updates show that artists who try this aren't having much success yet, but the Internet population is still increasing dramatically. PaxStreamline offers an innovation in commercial air-conditioning: apparently a significant amount of electricity is wasted on heating the air after chilling it excessively to remove moisture [10]. So instead of over-cooling it, they use a liquid desiccant to extract the moisture. Ben Schwartz explains why you should never create files in H.264 or MPEG formats: unless you have a special commercial license, you (and your viewers) will all be liable for patent infringement for any type of commercial use [11]. Note that storing the data on a web site with Google adverts counts as commercial use. I wonder if all those digital cameras and mobile phones that create MPEG videos have appropriate licenses; maybe uploading a file created on your phone breaches the patent. J. K. Rowling (author of Harry Potter) gave an inspiring speech for Harvard graduates [12]. I particularly liked the following reference to her work for Amnesty International: "Choosing to live in narrow spaces leads to a form of mental agoraphobia, and that brings its own terrors. I think the willfully unimaginative see more monsters. They are often more afraid." Peter Eigen gave an interesting TED talk about the formation of Transparency International and the economic problems that are caused by corruption [13]. The Monthly Review has an interesting article about the failure of the US justice system [14]. The prison-industrial complex has captured part of the US government, and neo-liberalism is to blame. 59% of Americans agree that "homosexuals" ought to be able to serve in the U.S. military, but 70 percent believe that "gays and lesbians" ought to be able to serve in the military [15]. Apparently 11% of Americans think that gays and lesbians are better than "homosexuals".

7 January 2010

Kumar Appaiah: Brief updates and some advice

So, while Richard is busy doing research and arranging or facilitating several Debian workshops, here I am, gulping down Idlis and Dosas by the dozen. It's been a fun visit, and I got to visit several places in my native state, but since my visit to Mumbai is rather short, I won't be able to say hello to several friends this time (sorry, Kartik). Maybe this will happen only when I am back in India on a permanent basis; let's see... Here are some general guidelines for travelers which I'd like to share: It's been a really fun trip, and I am sure I'll enjoy the rest of the trip, till I get back to my old routine and resume work. Finally, in keeping with the spirit of a blog I try to understand (albeit in vain, I must accept), I duplicate the concept of a post from that blog: here is an up-to-date list of airports in which I have sat, but never left except by way of airplane: More updates later. Bye for now.

11 December 2009

John Goerzen: Graduating and what now?

I guess I've never been one to do things a certain way just because that's how they're normally done. I took my first college class pretty young, back in 1995, just after my freshman year in high school. By the time of my junior year in high school, I was well on my way through the computer science curriculum at Wichita State, and would probably have graduated from WSU at about age 20 had I not moved to Dallas, and then Indianapolis, for jobs. I did so because I (correctly) thought that in my line of work, experience was more important than a degree. However, playing into that was the incorrect notion that an education was useful primarily as a means to a career, a notion that colleges and universities unfortunately have been promoting themselves. After moving back to Kansas, I started work to finish my degree; I had effectively one year left had I been a full-time student. But it was difficult doing so, living almost an hour from Wichita, having a full-time job, and then, when Jacob was on the way, I stopped classes entirely. Having online classes available to help me finish out the degree has been great. A year ago, I re-thought the value of education to me, concluding that there is a lot of value in education for its own sake. Looking at education as little more than a means to a career is the wrong way to approach it. Since I have had that re-evaluation, I've adjusted my path even at this late point, and have enjoyed my classes more than I ever had before because of it. Graduating! So anyhow, here I am, graduating in a few days, 14 years after I started. I've had some good news this semester, too: I'll pass the requirements to be a graduate of the WSU honors program, and also to graduate magna cum laude. So it may have taken me 14 years to finish college, but at least I've done it well. It feels weird to be about to be done with college, after so many years of being almost done, except for... And a question I've been thinking about a lot is: what next? Obviously I'll start by taking a break from college stuff. I want to be able to spend more time with Terah, Oliver, and Jacob, and there are things around the house that need attention. But after that? I figure I have a few options: No more college, but some reading. I do enjoy reading, and especially reading older works that are nicely out of copyright and thus free. I've read few of the classics, but I enjoy reading them, and want to be better informed about them too. That could be fun. But sometimes it's hard to know how to place things in context. For instance, I don't think I could have gotten as much out of reading some famous philosophers as I did by doing so in a class that put them all in context and discussed criticisms of different positions. I also wouldn't necessarily have the ability to discuss and debate ideas with my classmates. Take the occasional undergraduate class in a topic that interests me. I'm especially interested in history and literature, and haven't had much exposure at the college level in either. Various online courses are available from universities in Kansas on those topics. I could take a course or two each year and become a better-informed, more well-rounded person, and study an interesting topic at the same time. Maybe this would eventually lead to a degree, and maybe not. It would be fine either way. Graduate studies. It's a lot of work, and I'm not sure it would be all that interesting to me. I've looked some at curricula.
I don't think that I'm interested in computer science as a graduate program; I enjoy coding more than theorizing and analysis. But I also don't think that a graduate MIS or some such degree is for me; most of them appear to either focus on Windows or cover things that I've already known how to do for years, and thus I wonder just what the point of them is (in addition to questioning how rigorous the curriculum is). I could also, down the road, pursue a graduate degree in something like English, philosophy, or history, but it seems a natural path to that would be to take more undergraduate classes in those topics first. So for now, I'll sit tight. I'll see how I feel by summer, how pressed for time I feel at home, and whether I still feel the desire to take more classes. If so, I suspect I'll try to take some undergraduate classes in whatever sounds interesting to me at the time. In a way, that is quite freeing: take a class on whatever topic I like, without a defined degree in mind. I've never been free to do that before, and I like the idea of it.

6 December 2009

Ian Wienand: Distance to 1 million people

On a recent trip up the Oregon coast, a friendly doorman at our hotel in Portland was inquiring about our trip. When we mentioned we passed through Bandon, OR, he quipped that Bandon was the place furthest from a city of one million people in the USA. I guess a normal person would just think "oh, that's interesting" and move on, but it has been plaguing me ever since. Firstly, I had to find what cities in the USA had more than 1 million people. Luckily Wolfram Alpha gives the answer (I certainly wouldn't have guessed that list!). From there my plan was to find the bounding box of the continental USA; luckily Wikipedia has the raw data for that. Combined with the latitude and longitude of the cities above, I had the raw data. I couldn't figure out any way better than a simple brute force of testing every degree and minute of latitude and longitude within the bounding box and calculating the distance to the closest large city; the theory being that from one particular point you would have to travel further than from any other to reach a city of 1 million people. Luckily, that is short work for a modern processor, and hopefully the result would be a point somewhere around Bandon. I'd already become acquainted with the great circle and measuring distances when I did Tinymap, so a quick Python program evolved. However, it turns out that the program picks the far south-east corner of the bounding box. Thanks to the shape of the USA, that is way out over the ocean somewhere. I can't figure out a way to get an outline of the USA to test if a given point is inside the border or not, but all is not lost. I modified the program to output the distance to the closest large city along with the location to a log file, and then imported it into gnuplot to make a heat-map. The hardest part was finding an equirectangular outline of the USA to place the heat-map over, rather than the much more common Mercator projection; Wikimedia to the rescue! I actually surprised myself at how well the two lined up when, after a little work with Gimp, I overlaid them: Distance to a city of 1 million people (km). From this, I can see that Bandon, about a third of the way up the Oregon coast, is a pretty good candidate. However, it is probably not the best; I get the feeling the real point that is furthest from any city of 1 million people is actually somewhere in the central-middle of Montana. However, we can also fiddle the program slightly to disprove the point about Bandon. The numbers show the closest large city to Bandon is LA, at ~1141km. Taking another point we suspect to be more remote: the closest large city to Portland (where we met the doorman) is also LA, at ~1329km. So to reach the closest large city you have to travel further from Portland than from Bandon, so Bandon is not the furthest place in the USA from a city of one million people. Myth busted!
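For anyone curious what that brute force looks like, here is a rough sketch in Python of the approach described above. It is not the original program: it uses the haversine great-circle distance, an illustrative handful of the 1-million-plus cities rather than the full Wolfram Alpha list, and an approximate continental bounding box standing in for the Wikipedia figures.

# Rough sketch of the brute-force search described above: scan a latitude/longitude
# grid and keep the point whose nearest large city is farthest away.
# The city list is an illustrative handful, not the full list used in the post,
# and the bounding box is approximate.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# A few US cities with more than a million people (lat, lon) -- illustrative only.
CITIES = {
    "New York": (40.71, -74.01),
    "Los Angeles": (34.05, -118.24),
    "Chicago": (41.88, -87.63),
    "Houston": (29.76, -95.37),
    "Dallas": (32.78, -96.80),
}

# Approximate bounding box of the continental USA.
LAT_MIN, LAT_MAX = 24.5, 49.5
LON_MIN, LON_MAX = -125.0, -66.9
STEP = 1.0 / 60.0  # one arc-minute, as in the post; coarsen this to run faster

best_point, best_dist = None, -1.0
lat = LAT_MIN
while lat <= LAT_MAX:
    lon = LON_MIN
    while lon <= LON_MAX:
        nearest = min(haversine(lat, lon, clat, clon) for clat, clon in CITIES.values())
        if nearest > best_dist:
            best_point, best_dist = (lat, lon), nearest
        lon += STEP
    lat += STEP

# As the post notes, with no coastline test this tends to pick a corner of the
# box out over the ocean; the heat-map was the work-around for that.
print("Farthest grid point from any large city: %s (%.0f km)" % (best_point, best_dist))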

24 November 2009

Lucas Nussbaum: UDS Lucid

I'm back from Dallas, where I was invited to the Ubuntu Developer Summit for Lucid. I spent a great week there; the event was extremely well organized (by organizing them every 6 months, you are probably able to gather a lot of experience!). Of course, after all I had heard about people hugging each other all the time in the Ubuntu community, I was a bit worried, especially with the flu spreading! But there are lots of fantastic people around Ubuntu, and it was a very nice opportunity to be able to meet them all. Since it was my first UDS (I was at FOSSCAMP in Prague a few years ago, but didn't stay for UDS back then), I was not really sure of what to expect. I was very pleasantly surprised. UDS vs Debconf UDS is very different from Debconf. In Debconf, we do three different kinds of things: In UDS, the main focus is on the third point: most of the sessions are about discussing what will be implemented for the next release. All of the relevant developers are in the same room to discuss possible problems, and the outcome of each session is usually a detailed plan, with a list of action items. It's a very nice way to ensure that changes are well thought out, and it allows making large-scale changes in Ubuntu very easily (you don't spend weeks arguing about them on mailing lists). Of course, it's probably also helped by the fact that there's a company behind Ubuntu, with a set of large teams (kernel, foundations, desktop, etc), which helps transferring trust (not everybody feels like they have to participate in each discussion, even when they affect the whole distribution: the team in charge is trusted by the rest of the project). On the other hand (yeah, let's be negative for a while) it doesn't really help spreading information between Ubuntu developers: it's often a bit difficult to get the global view of what is happening inside Ubuntu, especially since lots of things are discussed on IRC. Collaboration between Debian and Ubuntu During the week, I was mainly interested in collaboration between Debian and Ubuntu. There's a strong focus on doing the right thing wrt Debian (and also other upstreams). Then, of course, Ubuntu also has an agenda, which sometimes requires moving very fast on some things, or making compromises between technical purity and pragmatism. But the willingness to have common foundations between Debian Squeeze and Ubuntu Lucid will surely benefit both distros. Quality Assurance On the QA front, I am planning to do archive rebuilds for Ubuntu as well (fixing FTBFS is an easy way to start contributing to Ubuntu or Debian, and having those bugs fixed in Lucid would benefit Debian as well, by having patches already prepared). I also had a session with the QA team, where I gave an overview of what we are doing in the Debian QA group, to discuss opportunities for collaboration. The Ubuntu QA team focuses more on testing (with automated or manual testing) and bug triaging than archive quality; that part is left to the MOTUs and the release managers. (About MOTUs, I liked how what they do was described as "long-tail maintenance", "landscape gardening" or "terraforming". That gives a good idea of what it's about.) Ultimate Debian Database On the Ultimate Debian Database front, I did a plenary talk to try to demonstrate how UDD could be useful to Ubuntu as well, and, with Jorge Castro, we examined some metrics of Ubuntu's giving back to Debian.
I also talked with the Launchpad team to try to resolve my long-standing "pretty please provide an export of Ubuntu bugs, so I can easily import them in UDD!" issue. Ubuntu and ARM ARM netbooks and smartbooks (a mix between netbooks and smartphones) are coming, and Ubuntu is clearly very well positioned to play an important role in that market. There was a whole track about ARM support, with lots of changes that will be done for Lucid. Let's all hope that Ubuntu-powered ARM netbooks win that market, so we don't reproduce the failures of the non-ARM netbooks. Distributed Development James Westby has been working on a set of tools to be able to work on Ubuntu packaging using bzr. The point is not to store the canonical source for Ubuntu packages in bzr (well, at least it's not the plan yet), but to provide a set of branches to make it easier to merge or cherry-pick from Debian. The resulting workflow looks extremely nice, with lots of syntactic sugar. And even better, he assured me that his code is portable to Git ;)
Using his work, merging Ubuntu-specific changes in a new version of a Debian package basically means pulling from lp:ubuntu/foopkg, merging from lp:debian/sid/foopkg, and you are done!
As a bonus, we (Debian) would get bzr branches with the history of packages (kind of bzr-powered snapshots.debian.org).
James' project is not completely ready yet, but should be very soon. It's already basically usable, apparently. Conclusion Ubuntu has clearly gone a very long way since 2004. Everything looks very well organized and polished, and gives the impression of a big machine that nothing can stop. With Cloud Computing and now ARM netbooks, Ubuntu has proven able to adapt to the current trends and attract a lot of visibility. It is great news for Free Software, but it also poses an interesting challenge to Debian: of course, it's nice that a Debian-based distro is in that position, but will Debian manage to stay relevant, or are we just going to be the technically-pure distro without many users that serves as a package supermarket for Ubuntu?

16 November 2009

Andrew Pollock: [life] Blogging on a plane!

Well, here I am, sitting in a chair at 10,000 feet (or whatever the altitude currently is), using WiFi. Writing a blog post. And we still don't have flying cars. I'm on my way to Dallas for the Ubuntu Developer Summit for the 10.04 (Lucid Lynx) release. Being an LTS release, this is of particular interest for what I do at work. Sarah's still in Australia (well she's actually on a flight to LAX as I write), so I took the VTA light rail + shuttle to the airport. I must say, aside from not being particularly speedy, it was a pleasant experience. I've finally cracked open The Audacity of Hope, which I received for Christmas or my birthday last year. I'm really loving the renovated San Jose airport. Now that all of the check-in counters have moved downstairs, they've about quadrupled the space the TSA has, which makes getting through security a much more pleasant experience. Add to that, the nice lady in the Admiral's Club kindly reseating me in an exit row, all by myself, and this is a pretty sweet trip so far. http://www.youtube.com/watch?v=8r1CZTLk-Gk

7 November 2009

Andrew Pollock: [life] Whirlwind visit to Brisbane

Sarah's Mum is flying her back for the scattering of her grandmother's ashes, and I figured that as this will be her third trip back this year, and she's seen my family more than I have in the last 12 months, I should come as well. So I'm getting my first opportunity to sample V Australia's service. I must say that flying Virgin America to LAX and then transferring to V Australia to fly direct to Brisbane seems like a fairly civilised way to do it. Anything that involves Virgin America is always a delight. My only complaints so far are that the SFO-LAX flight left late, and the check in line for V Australia in LAX was ridiculously slow given it was so short. The time of the flight is pretty good - it leaves LAX at 10:30pm, so hopefully we'll get a semi-decent amount of sleep. It gets into Brisbane at 6:30am on Sunday, so we'll have to try and imitate the living dead for the day. I'm heading back again on Saturday, as I have to be in Dallas next week for the Ubuntu Developer Summit.

8 September 2007

Christian Perrier: RWC: first day

First day of the Rugby World Cup around here. Here are some of my impressions (for those of you not aware, I've been a rugby fan for ages and this event in my country will definitely affect my work on free software in the next 6 weeks). France-Argentina (12-17): being beaten by Argentina is a surprise only for people who don't know that much about Argentina in rugby. Argentina is currently ranked 6th in the world national team ranking, very close to Ireland (5th) and ahead of England (7th), the last RWC winner. The Argentinians played the best rugby they could play against a very hesitant French team. All players know each other very well, as 2/3 of the Argentinian team plays... in France. The world's top player (imho), Juan Martín Hernández, led his team to victory while the French were anything but brilliant. Nothing really harmful, indeed: in rugby, completely screwing up a match happens and does not necessarily mean anything for other games. The only drawback: if France qualifies, it becomes more and more likely that the 1/4 final could be France-New Zealand. Ouch... New Zealand-Italy (76-14): I haven't seen that one, therefore missing the first "haka" of this RWC. The All Blacks are still the favourites of the RWC and they scored the quickest try so far. The Italian team fought quite well, from what I read, with one of my favourite players (Maurizio Bergamasco) scoring a try. Australia-Japan (91-3): in the past, Japan was closer to the top rugby world teams, particularly because of a long tradition of playing 7-a-side rugby and therefore being very efficient in back-line play. However, rugby has changed in recent years and power is now the key. The Japanese team was obviously lacking it... and then they had 14 tries scored against them by a very strict Australian team. England-USA (28-3 as I'm writing this): England is not that impressive but serious, still. The USA team plays well around the scrum and defends pretty well. On the English side, Mike Catt could be one of the stars of this RWC. And, yay, while writing this, the USA scored a try against England *and* Lawrence Dallaglio got expelled for 10 minutes. Two occasions for me to be happy... (sorry in advance to my friends in England). Tomorrow, we'll attend South Africa vs Samoa at Parc des Princes, Paris. That will definitely be a great feast and a nice rugby match. We'll of course put our hearts on the Samoan side (I'm afraid I've never been enthusiastic about the South African way of playing rugby).

22 August 2007

Guido Trotter: LISA, I m coming! :)

It’s official: I will be representing my team at USENIX’s 21st Large Installation System Administration Conference (LISA) in Dallas (TX), speaking about the project we’re working on! My talk will be on Thursday 15th November! Check out the program: http://www.usenix.org/events/lisa07/tech/#thursday Wow, I’m so happy! :)

27 April 2007

John Goerzen: Online Internet Bank Review

Back in mid-2000, I set up my first Internet bank account. I have been using the same bank ever since. The whole notion of Internet banks has changed a lot since then, since virtually every bank offers varying degrees of online access these days. Lately I checked on banks to see if I was still using the best one. Here are my reviews of the two banks.

Internet Banks

Let's first talk about what makes a bank an "Internet bank" or "online bank". So many banks let you manage your account online now, that the distinction is blurry. But generally, I would say that unique features of Internet banks are:



There are a few to highlight.

Real-time transaction posting is a great thing to have. If you go to the website and transfer money from your savings to your checking account, you can go to the ATM *now* and withdraw it, and when you get back to your desk, the ATM transaction shows up. There are none of these confusing "business day" rules with cutoff times that traditional banks seem to love. And none of this annoying "process all withdrawals first" business that large traditional banks (hello, Bank of America) love to do in order to screw with customers.

High interest rates. Don't even bother with an online bank if the savings interest rate is less than 3% (or checking less than 1%, in my opinion). It is almost impossible to find a traditional bank with these rates.

Note: if you're reading this after April, 2007, these guidelines may no longer apply due to interest rate changes.

Electronic transfers from other banks. Another great feature. A number of banks let you set up, and then initiate, electronic transfers to and from your accounts at other banks. This is usually free if you are using the feature to transfer money into your account, and with a flat per-transaction fee (usually about the same as a foreign ATM fee) for transferring money out.

Ability to do all business without visiting a branch. Not only should this be possible, but routine business (such as making deposits) should be easy to do without visiting a branch. You should be able to order new checks online.

Now let's look at the individual banks.

First Internet Bank of Indiana

FirstIB (they provide nationwide service, despite the name) is the bank I signed up with in 2000. I had found a bank in Dallas that I really liked, but when I moved to Indianapolis, I tried two local banks and liked neither. Both seemed to have really poor interest rates, few useful services, and fees for so many conditions that it was hard to avoid them. (More on that in the traditional banks section below.)

I looked at the online banking scene at the time, and wound up opening a checking and savings account with FirstIB. FirstIB started as an online-only bank, but recently acquired a bricks-and-mortar bank as well.

I found FirstIB's customer service at the time to be top-notch; I would dial their toll-free number, select *one* option from a menu, and a rep would answer *immediately*. (Take that, Bank of America, with your 45-minute hold times...) Since FirstIB was based in Indianapolis, I actually visited their offices a couple of times. They had a locked lobby. You picked up a phone, pressed 0, and went into the same queue that callers to the 800 number did, then just asked them to come to the lobby and transact your business. (They would not give out cash there, but could take new account paperwork and deposits and the like)

Their interest rates have consistently outperformed traditional banks, and they offer all the online banking amenities listed above. They did not offer online statements at first, but these days online statements are the norm (there is a fee if you want paper statements). Check ordering is handled through a link with the Deluxe check printers. The web interface uses Digital Insight, and the ACH transfer interface uses CashEdge.

FirstIB will actually reimburse you, at the end of the month, for ATM fees that other banks charge you, up to $6/mo. They also provide an unlimited supply of free postage-paid deposit envelopes, and have a PDF of deposit slips that you can print out should you run out.

FirstIB's customer service has slipped in recent times. I know of some people who have been affected by errors FirstIB has made (simply not processing paperwork that it should have processed). Wait times on the phone line have gone up, and a lot of the great "small bank" feel (where they actually knew their customers, and you might actually know the reps) is gone.

Presidential Bank, FSB

Presidential Bank is the bank I'm looking at now. I was initially drawn to them by their extraordinarily high interest rates (5.25% for savings!), but some other things have convinced me to use them.

Initially, I was suspicious. Presidential is a bricks-and-mortar operation that has been around for a while, and it was one of the very first Internet banks. Their website says they started offering Internet accounts in 1995, and it very much looks like it hasn't been updated since. (Thankfully, the account management interface is more modern.) I almost dismissed the site as a fraud up front, a "too good to be true" deal with the high interest rates and bad-looking website. But after doing some research, I realized that it is quite real.

Before setting up accounts with them, I had a few questions (such as whether or not transactions post in real time). Presidential has a simple email address for people that want to send in questions (unlike FirstIB's obvious queue system). I sent an email, and got a response an hour or two later, from a real person with a first and last name, and a personal e-mail address I could use to follow up. Nice. I also had occasion to talk to people on the phone. They were polite, speedy, and helpful.

So I set up my accounts with them. I activated the online access, bill payment, and money transfer features. All worked as expected. I did have a couple more occasions to call them after the account was set up (a more crucial test than the pre-account calls), and they remained helpful. Strangely, this bank, which seems to be a much larger operation than FirstIB, had a more small bank feel. Several times, as I started talking to a rep, he or she would say something like "Oh, I remember processing the setup paperwork on this account last week" or "I think I talked to you about this before?"

Customer service is one of those things you rarely need after the first month or so of an account. But with your bank, it is vital that the reps know what they are doing and are willing to help you when you call -- which Presidential is doing better than FirstIB these days.

Presidential does not rebate ATM fees like FirstIB does. They do provide a few deposit envelopes with your initial account setup packet, but they are not postage paid or unlimited. However, because Presidential's interest rates are so much better than FirstIB's, it doesn't take much in your account to more than balance out that difference. Plus, with so many retail stores offering "cash back" when you use an ATM or debit card, it isn't too bad to get cash for free with Presidential. (And, most employers offer direct deposit, so we rarely use deposit-by-mail) One other trick is to find some local bank and open a no-minimum free checking account with them. You can deposit checks there and withdraw cash from their ATM, and use Presidential's free incoming ACH to electronically transfer the funds to Presidential when you get too much in the local account. (Chances are that these local accounts will not pay any interest)
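To put a rough number on "doesn't take much", here is a back-of-the-envelope sketch. It uses the APYs from the comparison table below, assumes you would otherwise collect the full $6/mo FirstIB rebate every month, and ignores compounding, so treat it as an estimate rather than a precise figure.

    # Rough break-even sketch: at what balance does Presidential's higher APY
    # make up for losing FirstIB's $6/mo ATM-fee rebate?  APYs are the ones
    # quoted in the comparison table below; compounding is ignored.

    ATM_REBATE_PER_YEAR = 6 * 12  # FirstIB refunds up to $6/mo in foreign ATM fees

    rate_gaps = {
        # account type: (Presidential APY, FirstIB APY)
        "checking": (0.0450, 0.0126),  # Internet Plus vs. Interest Checking
        "savings": (0.0525, 0.0390),   # Premier vs. Money Market
    }

    for account, (presidential, firstib) in rate_gaps.items():
        gap = presidential - firstib
        breakeven = ATM_REBATE_PER_YEAR / gap
        print(f"{account}: extra {gap:.2%}/yr -> break-even balance of about ${breakeven:,.0f}")

With those rates, the break-even point works out to roughly $2,200 in checking or $5,300 in savings; above that, Presidential's extra interest more than covers the lost rebate.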

Presidential offers a few features FirstIB does not. First, they scan all your deposits (not just your canceled checks) and let you view them online. Second, they automatically send you an email when they've received a bank-by-mail deposit (you can opt out of that, of course). Third, they let you view *all* of your old statements online (FirstIB only lets you view the last 6 months' worth, though both banks make the complete transaction history available in the web interface).

Like FirstIB, Presidential uses DigitalInsight for the web interface and CashEdge for the ACH feature.

Internet Bank Comparison

Both banks offer two primary checking and two primary savings accounts. The tables below compare them.

Checking accounts:

    Feature                        FirstIB      FirstIB     Presidential     Presidential
                                   Interest     Free        Internet Plus    Internet
    APY                            1.26%        0%          4.50%            1.25%
    Minimum balance                $500         $0          $1000            $500
    Minimum balance method         Avg daily    Avg daily   Daily            Daily
    Minimum to open                $100         $25         $1500            $500
    Online bill pay                $0           $4.95/mo    $0               $5.95/mo (up to 10 payments)
    ATM surcharge rebate           $6/mo        $6/mo       $0               $0
    Online canceled check images   Free         Free        Free             Free
    Online deposit images          No           No          Free             Free
    Email deposit notifications    No           No          Free             Free

Savings accounts:

    Feature                        FirstIB      FirstIB     Presidential     Presidential
                                   Money Mkt    Reg         Premier          Internet
    APY                            3.90%        2.75%       5.25%            1.50-2.50%
    Minimum balance                $4000        $1000       $0               $100
    Minimum balance method         Avg daily    Avg daily   Daily            Daily
    Minimum to open                $100         $100        $5000            $100
    Online deposit images          No           No          Free             Free
    Email deposit notifications    No           No          Free             Free

    (Online bill pay is not available on the savings accounts, canceled check
    images do not apply, and there is no ATM surcharge rebate.)

For all accounts at both banks, email balance alerts are free. The ACH policy is also the same everywhere: free when initiated at the other end, free for deposits initiated at the bank's website, and a small fee for withdrawals initiated at the bank's website.

Online options at traditional banks

As I mentioned, Presidential is a traditional bank that has offered Internet accounts for some time now. Some other traditional banks do so as well, but look at them carefully, especially if they are big nationwide banks.

I looked at options from Bank of America, Citibank, HSBC USA, ETrade Bank, and some other large national banks. In general, these were just regular bank accounts with a web interface (HSBC and ETrade being partial exceptions). Read the fine print and you'll find poor interest rates and traditional banking hassles (lack of immediacy in transactions, predatory posting practices, and fees for just about everything).

Plus I can tell from personal experience that Bank of America customer service is downright awful. One example: they used to pressure me to open a credit card with them every time I visited a branch. I declined every time, and had to be forceful about it sometimes. Then one day, not long after a visit with a particularly annoying employee, I got a Bank of America credit card -- which I had specifically refused -- in the mail. Hmmmmmm.

They also, unlike any other bank I've ever used, charged a fee to receive wire transfers, which they had not disclosed. Then, of course, they applied withdrawals before deposits and caused an overdraft from the undisclosed fee, even though the wire transfer was a *deposit* that would very easily have covered it. That was the last straw for me. After nearly an hour on the phone with them, I got them to agree to reverse both the fee and the overdraft charge. THEN I said I wanted to close my account.

When traditional banks are better

There are a few situations where Internet banks don't work out so well:

Living paycheck to paycheck. If you are constantly running out of money, and it is vital that any deposits get posted *NOW*, you probably don't want an Internet bank (and wouldn't benefit from their higher interest anyway). When you mail in a check, it can take some time before you have access to the money; the postal service has to deliver it, and then most of your checks will be subject to the "non-local" availability policy and held for about a week (though interest will start to accrue immediately).

Not being able to meet minimum balances. Internet banks do tend to offer a variety of accounts, but if you aren't able to regularly maintain $500 to $1000 in your checking account or $2000 to $5000 in your savings, you will likely fall into a category with lower interest and fewer free benefits. These accounts may not have many benefits over traditional banks, and may in fact be worse in some ways, and the interest rate gap won't make much of a difference either. There are several online banks that offer savings accounts only, with high interest rates and low minimums, that you may want to investigate, though transfers into your checking account will then take days rather than seconds. Presidential's Premier savings account has no minimum balance, but it requires $5000 to open. (They do have lower-interest savings accounts without that requirement.) Watch out for FirstIB's accounts: you can open one with less than the monthly minimum balance, and you will run into fees unless you deposit enough to meet the minimums.
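As a quick sanity check on the claim that the rate gap barely matters at these balances, here is a small sketch using rates from the comparison table above, ignoring compounding, at an illustrative $500 balance.

    # Annual dollar difference between two APYs on a small balance,
    # ignoring compounding.  Rates are from the comparison table above;
    # the $500 balance is just an illustrative figure.

    balance = 500

    gaps = {
        "top-tier checking (4.50% vs 1.26%)": (0.0450, 0.0126),
        "low-tier checking (1.25% vs 0%)": (0.0125, 0.0),
    }

    for label, (high, low) in gaps.items():
        diff = balance * (high - low)
        print(f"{label}: about ${diff:.2f} per year")

At that balance, even the best-case gap is around $16 a year, and the low-tier gap is closer to $6, which is why the higher rates alone are not a compelling reason to switch.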
