Ever since the open-source rebranding in 1998, I’ve been telling people that “open source” should not be capitalized because it’s an engineering term of art, and that we would have achieved victory when the superiority of (uncapitalized) open source seeped into popular culture as a taken-for-granted background assumption.
There’s a thriller writer named Brad Thor whom I had never heard of until he publicly offered to buy George Zimmerman any weapon he likes as a replacement for the pistol the police impounded after the Trayvon Martin shooting. What Thor was really protesting, it seems, was the fact that Zimmerman didn’t get his pistol back when he was acquitted; instead, the federal Justice Department impounded it while it looks into trumping up civil-rights charges against Zimmerman.
This made me curious. The books are pretty routine airport-novel stuff, full of exotic locations and skulduggery and firefights. Like a lot of the genre, they have a substantial component of equipment porn – lovingly detailed descriptions of weapons and espionage devices.
Amidst all this equipment porn the characters casually use “open source” (specifically of encryption software) as a way of conveying that it’s the best available. And the author writes as though he expects his readers to understand this.
Victory is sweet.
There are reasons for using open source crypto software, and indeed compiling it oneself on one’s own computer, that do not apply for open source word processors, open source spreadsheets, and open source desktops.
Of course, even open source crypto software cannot be trusted, even if you compile it yourself, unless you totally bootstrapped the entire computer from scratch. Ken Thompson’s 1984 paper “Reflections on Trusting Trust” points out that it’s possible to put a back door into a compiler that not only pollutes the operating environment, but also detects when the user is attempting to compile a clean copy of the compiler itself, and re-inserts the back door into the new copy.
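To make the shape of the attack concrete, here is a toy sketch in Python. It is purely illustrative – the names and the string-matching “recognition” are hypothetical, and Thompson’s actual demonstration was done in C inside a real compiler:

# Toy model of the Thompson attack: not a real compiler, just the two-pronged,
# self-propagating back door he describes. All names are hypothetical.

BACKDOOR = "# back door: also accept a hard-coded magic password\n"
SELF_REINSERTION = "# back door: re-insert this payload when compiling a compiler\n"

def looks_like_login(source: str) -> bool:
    """Crude pattern match standing in for 'recognizing' the login program."""
    return "def check_password" in source

def looks_like_compiler(source: str) -> bool:
    """Crude pattern match standing in for 'recognizing' the compiler itself."""
    return "def compile_source" in source

def trojaned_compile(source: str) -> str:
    """Pretend compilation: returns a 'binary' (here, just annotated source).

    Prong 1: if the input looks like login, add a password back door.
    Prong 2: if the input looks like a compiler, re-insert this whole trojan,
    so that even a scrupulously clean compiler source yields a dirty binary.
    """
    output = source
    if looks_like_login(source):
        output = BACKDOOR + output
    if looks_like_compiler(source):
        output = SELF_REINSERTION + output
    return output

# Compiling perfectly clean sources still produces tainted binaries, which is
# why inspecting the source alone cannot detect the attack.
clean_login_source = "def check_password(user, pw):\n    return pw == lookup(user)\n"
clean_compiler_source = "def compile_source(src):\n    return translate(src)\n"
print(trojaned_compile(clean_login_source))
print(trojaned_compile(clean_compiler_source))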
Of course, even the sloppiest open source software is more trustworthy than the liberal media scumbags who canonized Trayvon Martin.
Ok… I feel completely unobservant. I hadn’t even noticed an “Open Source” vs “open source” debate. I’ll make sure I use the appropriate capitalisation (I can’t remember off the top of my head whether I use capitals or not, but I fear the former would be instinctive for me).
>I hadn’t even noticed an “Open Source” vs “open source” debate.
In the original call to the community I wrote:
There’s no “debate”, just people succumbing to a tendency to overcapitalize common nouns as though they were writing in German.
I must confess to a tendency to capitalize “open source” myself. I subconsciously associate the phrase with the Open Source Initiative, and therefore I assumed it needed to be capitalized as an official term, the way a proper noun would be.
In what contexts would it be proper to capitalize the phrase “open source”?
>In what contexts would it be proper to capitalize the phrase “open source”?
When it is used as part of a proper-noun phrase, such as “Open Source Initiative” or “Open Source Definition”.
@IGnatius T Foobar
“unless you totally bootstrapped the entire computer from scratch. Ken Thompson’s 1984 paper “Reflections on Trusting Trust” points out that it’s possible to put a back door into a compiler that not only pollutes the operating environment, but also detects when the user is attempting to compile a clean copy of the compiler itself, and re-inserts the back door into the new copy.”
That has been solved by David A. Wheeler:
http://www.dwheeler.com/trusting-trust/
“David A. Wheeler’s Page on Fully Countering Trusting Trust through Diverse Double-Compiling (DDC) – Countering Trojan Horse attacks on Compilers”
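For the curious, the core of Wheeler’s check is easy to sketch. The toy below models compiler binaries as strings and “running” a binary as a pure function, so it captures only the comparison step; all names and behavior are hypothetical, and the rigorous construction is on Wheeler’s page:

# Toy sketch of diverse double-compiling (DDC). "Binaries" are strings, and
# "running" a compiler binary is simulated by checking it for a trojan marker.

TROJAN_MARKER = "<<TROJAN>>"

def run_compiler(compiler_binary: str, source: str) -> str:
    """Simulate executing a compiler binary on some source.

    An honest binary performs a fixed, deterministic 'translation'. A trojaned
    binary does the same, but also re-inserts its trojan whenever the source
    looks like a compiler (the Thompson attack).
    """
    translated = "BIN(" + source + ")"           # stand-in for real code generation
    if TROJAN_MARKER in compiler_binary and "compiler" in source:
        translated = TROJAN_MARKER + translated  # self-propagating back door
    return translated

def ddc_matches(compiler_source: str, suspect_binary: str, trusted_binary: str) -> bool:
    """Core DDC comparison.

    Stage 1: build the claimed compiler source with an independent, trusted compiler.
    Stage 2: use that stage-1 binary to build the same source again.
    If the suspect binary really is an honest build of compiler_source, what it
    produces from that source must be bit-for-bit identical to stage 2.
    """
    stage1 = run_compiler(trusted_binary, compiler_source)
    stage2 = run_compiler(stage1, compiler_source)
    regenerated = run_compiler(suspect_binary, compiler_source)
    return stage2 == regenerated

compiler_source = "source of the compiler under test"
trusted = "BIN(source of a completely different, trusted compiler)"
clean_suspect = "BIN(source of the compiler under test)"
dirty_suspect = TROJAN_MARKER + "BIN(source of the compiler under test)"

print(ddc_matches(compiler_source, clean_suspect, trusted))  # True: binary matches its source
print(ddc_matches(compiler_source, dirty_suspect, trusted))  # False: hidden trojan exposed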
(off-topic)
ESR
To quote one of your posts:
I’m disturbed by the fact that even though I’ve been thinking about the matter for years, I haven’t found any Schelling points in the theory of IP rights that look really stable. This suggests that the IP abolitionists may win the argument in the end. That could be very bad, because there are important kinds of creative work I don’t see how to fund without IP rights that allow creators to capture positive externalities. I’m not worried about software, because that can be funded from its value as an intermediate good; I’m worried about music and novels and artistic goods with economics like those.
Have you read a book titled “Against Intellectual Monopoly”? It systematically demolishes the utilitarian pro-IP arguments typified by the sentence “That could be very bad, because there are important kinds of creative work I don’t see how to fund without IP rights that allow creators to capture positive externalities.”
Dead Tree Edition
PDF version
— Foo Quuxman
p.s. oh, um, sorry for the epic-level brain fart in the other thread. Not sure how I forgot that one.
For anyone looking for the ESR post I quoted, it is here: http://esr.ibiblio.org/?p=1337
I meant to put that link in the first post.
— Foo Quuxman
JAD: “There are reasons for using open source crypto software, and indeed compiling it oneself on one’s own computer, that do not apply for open source word processors, open source spreadsheets, and open source desktops.”
There are those who think the NSA is out to hoover up every scrap of data it can get its grubby little paws on, no matter how trivial (I am not one of them), and they would beg to differ.
OTOH, when was the last time someone conducted a full source code security audit on the typical Linux desktop and application suite? Has it ever been done?
In case you haven’t bumped into it yet, gitian (https://github.com/devrandom/gitian-builder) does reproducible builds, which means they can be usefully signed.
Disclaimer: I’m not a dev on the project, I just think it’s a Good Idea.
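For what it’s worth, the reason reproducibility makes signatures useful is that independent builders should all get byte-identical artifacts, so their signed attestations can be cross-checked against a single hash. Here is a minimal sketch of that verification idea – it is not gitian’s actual format or tooling, just the general principle, with hypothetical names:

# Minimal sketch of verifying a reproducible build: trust an artifact only if
# enough independent builders attest to the exact digest we compute ourselves.
# This illustrates the idea; it is not gitian's real data format or workflow.

import hashlib
from typing import Dict

def sha256_file(path: str) -> str:
    """Hash the artifact we were handed (e.g. a downloaded release tarball)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def enough_independent_agreement(local_digest: str,
                                 attestations: Dict[str, str],
                                 quorum: int = 3) -> bool:
    """True if at least `quorum` independent builders published the same digest
    we computed locally. `attestations` maps a builder's identity to the digest
    that builder signed; checking the signatures themselves is assumed to
    happen elsewhere (e.g. with GPG)."""
    agreeing = [who for who, digest in attestations.items() if digest == local_digest]
    return len(agreeing) >= quorum

# Hypothetical usage with made-up digests:
local = "ab" * 32
print(enough_independent_agreement(local, {"alice": local, "bob": local, "carol": local}))  # True
print(enough_independent_agreement(local, {"alice": local, "mallory": "cd" * 32}))          # False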
Why are you not one of those people when NSA director Gen. Keith Alexander basically made this the mission of the NSA?
Because I refuse to indulge in that level of paranoia.
The set of people in the world qualified to participate in such a project, all working around the clock, would not make up 1% of the manpower necessary to bring the project to completion in less time than the half-life of the code subject to audit.
>There are those who think the NSA is out to hoover up every scrap of data-
I kind of like to think the Feds are imitating Lem’s Information Pirate. As John D MacDonald’s Travis McGee said, it’s prudent to feed them lots extra!
“The set of people in the world qualified to participate in such a project, all working around the clock, would not make up 1% of the manpower necessary to bring the project to completion in less time than the half-life of the code subject to audit.”
Yeah, that’s my thinking too.
In that environment, how much value does the openness of the code add when considering the likelihood of backdoors or their absence?
How did|do the OpenBSD people get their audits done then? Do they have a (near) monopoly on the people qualified to audit such code?
OpenBSD only audits the kernel, core userland, and I think X11 and selected servers like Apache.
They couldn’t give two shits about the desktop and crap like OpenOffice.
The fact that a lot of this garbage is often written in C++ doesn’t really help it from an auditing perspective. If one could hypothesize an OpenOffice written in Ada, that might be more amenable to auditing and analysis. But fat chance of that ever happening.
But if they were really writing as though in German, they’d have to say things like “Opensourcecryptosoftware”, “Opensourceofficesoftwaresuitewordprocessor”, or “Opensourceglobalpositioningsystemstandardizedinterfacedaemon”.
To me, “open source” is a generic description, but “Open Source” is something that fits the OSI definition.
>To me, “open source” is a generic description, but “Open Source” is something that fits the OSI definition.
Please don’t do that. The phrase “open source” is meant to be both a generic description and a pointer to the Open Source Definition. In this it is like many other terms of art in engineering; the phrase is not a proper noun, and thus should not be capitalized, but it is defined by a technical standard whose name is. Compare, for example, “token-ring network”.
Compare, for example, “truth, justice, and the American way” or “my glasses were thicker than the bottoms of glass Coke bottles”.
Your fixation on the phrase not being a proper noun suggests that you’ve never heard of a proper adjective. If the phrase is being used specifically to denote compliance with the Open Source Definition, then it falls squarely under “adjectives formed from proper nouns”.
It’s never WRONG to NOT capitalize the term, if the intent is to use the more generic sense, rather than emphasize the specific proper-noun Open Source Initiative. But it’s also not wrong to capitalize it IF it’s specifically referring to the OSI or some other organized entity with “Open Source” in its name. So the only way that “open[ -]source” becomes predominantly non-capitalized is for the generic use of the term to become so entrenched in the language that the OS-branded orgs themselves have become irrelevant, so that there is no need to refer specifically to the proper noun.
@ Monster
I gather that you are suggesting that “American” is a proper adjective. I don’t know what you were getting at with the second example; “Coke” is a noun (although, interestingly, I have heard that in the South, “Coke” is a generic word for pop or soda, such as Pepsi).
Your first example made me think of “Secret laws, secret prisons and the American way” which leads us to a classic proper adjective: “Orwellian”.
@ Jay
Paranoids can have enemies.
@ Jay
What I should have said was:
Just because a person is concerned about being monitored and information about him being recorded doesn’t make him clinically paranoid.
@ Jay
Which is why Google Glass makes my skin crawl. Why do you think the terms of use say that you can’t lend them to a friend? Google wants to know who they are collecting information about.
Now, Google isn’t connected to the NSA. And Lockheed Martin isn’t connected to the DoD. They are just suppliers.
What really scares me is that in Canada, surveillance cameras go up at practically every intersection with traffic lights and it seems that no one even comments about it. The problem with Canada is that there has never been a really good reason for people to worry about the government. Or so almost everyone seems to believe.
Ok – sorry – cosmically off-topic
Eric, you’ve never seen the movie AntiTrust? It involves a Microsoft-expy company developing a network service called Synapse (this was in the days when there was a lot of fear and distrust surrounding .NET[0] and Trusted Computing); early in the film a hacker asks the Gates-alike company president via instant message: “Why don’t you make Synapse open source?”
It was perhaps the first major Hollywood production to mention open-source software in a positive fashion, and even show people using Linux systems.
The movie itself was shit — basically a retread of Enemy of the State with Microsoft as the bad guys — but the screenwriter at least did a modicum of homework on hacker cultural values, which is more than I can say for many computer-themed films of the day.
[0] When .NET was publicly released in the early 2000s, it not only comprised the CLR and associated languages and tools, but also a suite of centralized network services — a reflection of Microsoft’s talent and penchant for creating a brand without being clear about what specific thing that brand is. Fear and loathing about users’ privacy was the result; these days we upload our whole lives into “the cloud” without batting an eyelash, giving the NSA plenty of rope to hang us with.
According to the Washington Post several NSA insiders have come out and stated that General Alexander has made it a goal of the agency to collect any and every internet communication it possibly can.
Belief in a surveillance state is not paranoia when you actually live in a surveillance state.
@BRM
Arguably, the word “Coke” in “Coke bottle” can be construed either as the first half of a compound noun or as an adjective modifying “bottle”. But by that reasoning, “Open-Source” in “Open-Source software” could be construed as the first part of a compound noun as well, and now we’re right back to whether it’s a proper noun or a common one.
ESR’s point seems to be that “open source” is losing its status as a proper noun just as “aspirin”, “kleenex”, “hoover”, and “xerox” have to varying degrees done so. And that’s fine. I part company with his insistence that proper adjectives should not be capitalized.
>I part company with his insistence that proper adjectives should not be capitalized.
No, we interpret the usage type of the phrase “open source” differently. I class it as a technical term of art, not a proper noun and thus not to be capitalized, which happens to be formally defined by a standard with a proper name (the Open Source Definition). There are plenty of precedents for this.
From the beginning I never wanted “open source” to be capitalized, because I never wanted it to look like someone’s exclusive brand or tied technique – just the thing programmers commonly do if they care about best practices.
@ Monster
Huh. I didn’t think of that.
In any case, I wasn’t really taking a poke at what you wrote; I was using it and Jay’s comment to refer to….
… the US has changed so much … after 9/11, I’d hear about politicians saying crap like “They hate our freedoms”…. and, justified mostly by 9/11….
I read the Washington Post article to which Jay linked. Nothing in it really surprised me, but after reading it, I feel almost like I am in shock. The fact that I read it was presumably recorded. I clicked the link to see more comments. That act was presumably recorded. What I write here is, being a blog, obviously recorded but presumably also scanned by NSA computers.
I can’t be bothered to do the standard comparison to highway traffic deaths, but the number of people killed by terrorists is… unfortunate but, really, almost trivial in relation to the changes in the USA that are justified on the basis of fighting terrorists.
I’m not an American, but speaking as if I were, the joke goes: “The constitution may not be perfect, but it sure beats what we have now.”
In light of how I feel after reading that article, it is a very sad joke. The US used to be the standard by which the ethics of countries were judged. Now… secret laws, secret courts, secret results, secret prisons. It is very, very sad.
“Checks and balances” was a fundamental concept in protecting the people from their government.
Suppose the US goes from being a surveillance state to being an out and out totalitarian state. The US now has in place a great system to defend itself against any person or group that objects.
Amidst all this equipment porn the characters casually use “open source” (specifically of encryption software) as a way of conveying that it’s the best available.
Well it IS fiction. Lol.
Some FOSS stuff is best of breed. Crypto maybe falls in that category, maybe not; I don’t do crypto stuff.
For a lot of other stuff, FOSS might be the best bang for the buck in some cases, but it’s not close to best of breed if money is no object.
Even at zero license cost, FOSS sometimes costs more than commercial products once lost productivity is weighed against license fees.
And when you’re using it without specific reference to the OSD, it shouldn’t be capitalized. It should only be capitalized when the writer is making the specific point that he’s referring to the OSD as opposed to, say, Microsoft’s creative interpretation of the term.
Again, I agree completely that people not capitalizing the words means that we’re winning. It means that people have largely rejected other definitions, and will use “open[ -]source” with the scare quotes when talking about them, possibly preceded by “so-called” (or even “soi-disant” if the writer is in the mood). I only disagree with the notion that it’s somehow wrong to capitalize it when it’s specifically talking about OSI or some other organization with OS in its name.
@Jeff Read
Another minor thing I noticed about AntiTrust was that it actually showed Linux on the screen [in the form of the then-current look of the GNOME desktop environment] rather than a “generic Hollywood special-effects operating system”, which I [at age 16] took as evidence that they knew their stuff.
http://www.theguardian.com/world/2013/jul/31/nsa-top-secret-program-online-data
How’s that sand feel around your head, Jay?
I agree that not a lot of fosstards understand that time is money, and the time that people lose dicking around with anything that is not Microsoft Office can more than pay for a copy of Office.
But crypto is one of the few areas where the advantages of open source far outweigh the costs.
However, the only way people en masse would be able to take advantage of, say, strong crypto for email is for Microsoft to build the crypto algorithm into Outlook.
Dão Gottwald [:dao] 2010-06-25 06:37:00 PDT
Created attachment 454024 [details] [diff] [review]
patch
Users can override this using userChrome.css if they absolutely want it. I don’t think the prefs are worth it.
From here:
https://bugzilla.mozilla.org/show_bug.cgi?id=574654
This is what I think epitomizes open source more than anything else.
Three years of comments followed about how editing userChrome.css was complex and easy to screw up, and how the previous version worked well. Nope: Dão Gottwald owns that piece of Firefox, and if you think an extremely important feature shouldn’t be broken, then you can go screw yourself.
Open source is rule by the petty and insane.
And now for something completely different…
A Wired article:
First Open Source Airplane Could Cost Just $15,000
15 grand? That’s less than a Leaf!
It’s time for me to take my thumb out my ass and get my private pilot’s license, like I’ve always dreamed of ever since booting up the only Microsoft software I still respect — Flight Simulator…
This has been a weird day…
I just noticed that, in addition to screwing up the close-italics tag, what I posted didn’t make the point I was trying to make.
I originally saw this on CNN and the article began:
and it had a link to the original article on Wired, which began:
CNN felt the need to add a hyphen in “open-source”, but in both cases they used “open source” in lowercase letters and felt no need to explain the term other than to say that the plans would be free. Furthermore, the article didn’t define “crowdsourced”.
There was also a paragraph that included:
In a way, this is even sweeter than ESR’s example of reference to open source software because – here it is – used as a “technical term of art” in relation to something that pilots and engineers are doing.
>In a way, this is even sweeter than ESR’s example of reference to open source software because – here it is – used as a “technical term of art” in relation to something that pilots and engineers are doing.
Indeed. I consider it a feature, not a bug, that the compound adjective in “open-source software” can be and has been generalized to refer to hardware and unspecified other components that have freely reusable and modifiable designs. The maximum win is for engineers of all kinds to come to think of open-source designs as routine best practice.
@ ESR
You might recall an article that I sent to you in 1999 or 2000 for your comments called “The Generation Gap – open-source components in closed-source applications”. Apparently, I didn’t make myself clear in the article because in your reply, you said that you didn’t know much about “components”, when I just meant… you know… software parts, functions, libraries.
This was when I first became enraptured by the Internet. I had also written an article in 1999 called “The Cash and the Calling – the potential of open-source AI parts”. The title was a direct steal/metaphor of/on “The Cathedral and the Bazaar”.
I was paid for the second article, which was published in the Linux Journal (although at the last moment, they decided to only put it online, not in the paper magazine) and I would have been paid for the first one, but they wanted “First Serial Rights” and I had already “published” it on my website, so it was published in the Linux Gazette. (They are both available on my website now, if anyone cares.)
ANYWAY…
After the second article was published, I got an email from a guy saying that software was hardly the first area in which participants were deeply into sharing and he referred to the ham radio culture. I sent him a reply, telling him my (past) ham radio call sign and the fact that I had the 1972 edition of the Radio Amateur’s Handbook on a book shelf three feet behind my chair and that of course he was right.
The ham radio culture is/was very much like the open source software culture and, now, the culture associated with the open source airplane project.
SO…
The phrase “open source” has become recognized, outside of the field of software, not only as a “technical term of art”, but as a cultural thing as well.
Hmmm… my comment is “awaiting moderation”. Way too much bragging, I guess.
@ ESR
Feel free to cut out all but the last two paragraphs, if you want.
@ ESR
Huh. You just used the word “components” to mean exactly what I meant in “The Generation Gap – open-source components in closed-source applications”.
Oh…. the article I quoted used the word “components”.
OK. No more serial commenting (for now).
@BRM
That’s because they used it as an adjective, preceding and modifying “airplane”.
This child is two feet tall.
He is standing by a two-foot-tall fence.
When I was younger, my beard was red. (Now it’s white.)
The red-bearded Viking leader swung his sword.
—
The same thing happens with “phrasal verbs”:
I plan to log in with my admin account and lock out users at 11 pm, then shut down the server for maintenance.
The automatic lock-out was triggered by excessive login attempts.
The last shutdown was unexpected.
In some cases the two words become one; in others they are hyphenated.
Just remember that when they’re verbs, they’re two separate words. Login is not a verb.
>That’s because they used it as an adjective, preceding and modifying “airplane”.
And that is correct, as I pointed out at the end of my original call to the community.
I like the idea of the Maker Plane. As a CFI, I’m a little disappointed that nobody’s talking about how to fly the thing yet, though, or how to teach others to transition into it from other aircraft, or…
I want one. Perhaps one day I’ll be able to build one.
@ESR how much do you like it when it is used outside software? E.g. http://opensourceecology.org/gvcs.php
>@ESR how much do you like it when it is used outside software? E.g. http://opensourceecology.org/gvcs.php
That’s a good use. It appears from a skim of the website that they know what they’re talking about and are publishing under BSD- or CC-equivalent licenses. I’ve seen other attempts at generalized uses that weren’t so happy, coming from people with fewer clues.
The maker plane is but a step on the road to the maker flying car…funny that the maker jetpack hasn’t appeared by now.
Huh. Live and learn. I went to four different elementary schools (one in a different province and one, briefly, in New Zealand) at a time when various new ideas were being tried.
The result was that I got part of one class – maybe 40 minutes – of grammar and syntax. On the other hand, by the end of grade 6, I had read hundreds and hundreds of books. Occasionally, like now, I learn something new about the subject.
I remember one thing I learned (probably a few years later)…
A Texan was at Harvard for the first day. He stopped someone and asked, “Kin y’all tell me wheah the Labree’s at?”
He was told, “At Haavad, we do not end sentences with a preposition.”
The Texan replied, “Kin y’all tell me wheah the Labree’s at, asshole?”
The local pronunciation is “Hahvid”. And you’d actually be looked down upon in Cambridge with such a townie accent; Americans attending Harvard or MIT tend to cultivate an accent similar to North American broadcast English.
Okay. And the Texan may not have used “y’all” when speaking to one person, but he might well have said “Laaabree’s” and pronounced “hole”, being at the end of a sentence, as two syllables.
I’m not following this.
Who pronounces it “Hahvid”?
When you say “in Cambridge”, do you mean people in Cambridge that don’t attend Harvard or MIT?
By “townie accent”, do you mean small-town (i.e., rural) or what?
To my ear, “North American broadcast English” sounds like how people speak in Calgary and most of English-speaking Canada. Which raises the question: Is there a neutral accent (like a preferred frame-of-reference), or is it all relative?
@ Jeff Read
Lemme try to ask this another way…
Are you saying that people “in Cambridge” – i.e., people who attend Harvard and MIT – would look down upon the local small-town-accent pronunciation of “Harvard” as “Hahvid”?
Working-class people from Boston proper or one of the outlying cities such as Everett, Revere, or Quincy. An almost-archaic exception is the Boston Brahmin accent, which sounds almost like a posh English accent.
Actually, Boston-area accents are highly complicated; and a trained ear can identify the town, neighborhood and possibly even the particular street the speaker grew up on from a speech sample. I can’t do this.
Many Cambridge denizens are alumni who have started or are working for businesses in the area, so looking down on the accent extends outside the universities.
No, I mean someone from “in town”, i.e., Boston. In general a townie is someone who is from the town or area a university is in, who is not a student. If you’re at Harvard or MIT and you hear a Boston accent, the odds are good that the speaker is a townie.
Strictly speaking, it’s all relative. However, American broadcast English (a.k.a. the General American accent) is considered by a majority of Americans to be “accentless”, and speakers don’t sound like they come from a particular place; hence its selection as the standard accent for national broadcast media. Regionwise it’s closest to the accent spoken midcontinent, particularly the Omaha area; a lot of people from Omaha have made a tidy living as newscasters, radio announcers, and telephone operators for that reason.
There are a few differences between General American and General Canadian; in particular we pronounce words like “sorry” and “bike” with lower vowels than you do.
>There are a few differences between General American and General Canadian;
…but querent is generally correct that these accents are very little different. Notably, they’re much closer together than a lot of adjacent British dialects.
>There are a few differences between General American and General Canadian; in particular we pronounce words like “sorry” and “bike” with lower vowels than you do.
Don’t forget “about”. /ai/ is actually raised by a fair portion of Americans with accents close to GA (though not in the core GA territory itself), whereas /au/ is rarely raised south of the border.
Eric
Speaking of victory, here’s another vindication on the horizon for you:
http://www.indiegogo.com/projects/ubuntu-edge–39
I can’t remember the post where you made this long-term prediction about the post-PC world, but you can pat yourself on the back for another one well made.
You do have to listen closely to distinguish General Canadian from General American, but it is possible. In particular, when you hear someone say “aboat” instead of “about”, you’re listening to a Canadian.
And don’t ever let a Canadian tell you they don’t say “eh?”, either. They do; they just don’t realize they’re doing it.
Hello everyone, I finally made it back. Nice to be able to regularly read all the quality posts and debate again, Mr. Raymond. I’ve missed this place.
All of the following is about dialect and accent, spurred by Mr. Read’s comments, and it’s longish, so please skip it if not interested:
For those here who may be interested (I know some of you are fascinated with language and its related elements), with regard to the Boston accent and all of this: there is a “general” New England accent present in many areas (though not in all New England states, and not even across the entirety of a particular state – it depends upon the area) whose dialects share many common characteristics but differ in fundamental ways – for example, when and how the “r” may be dropped, how the “o” is accented, and so on. My current girlfriend (who is Arabic) spends a lot of time listening to the way people talk. She doesn’t seriously study it, but she is fascinated by it. She will tell you that in general I pronounce my “o” like “oa”, my “u” like a short “a” (so “ducking” might sound like “daking” – like the “a” in “fracking” – oh yeah, geek points!), and of course I drop the “r”. These habits change depending on the structure of the word and sentence, of course, but they give you the overall idea.
As Mr. Read points out, like Britain, the “townie” accent in Boston often differs (even if slightly) from ’hood to ’hood and street to street. An aware individual may indeed be able to tell the source of a person’s accent by hearing it and paying attention. Or possibly by the words they use. A hoagie-type sandwich is an easy tell, for example – it could be called a hoagie, hero, submarine, grinder, spuckie, etc. And sadly, he is very right that those of us who speak this way are often looked down upon by the Cambridge people and other “elites”. It wasn’t always this way, but for some reason this way of speaking has become associated with the blue-collar working class (I certainly come from that environment). I can’t say that I had this experience with the folks at MIT, though. Maybe I just got lucky with my associations, but all they seemed to care about was your ideas and competence. I wonder if that’s due to the type of schooling MIT most often engages in.
Interestingly enough, I read somewhere (I don’t remember where, so I can’t source it) that the general Boston accent is actually quite close at its core to some older forms of English, and that the pronunciations and terms used today are very similar to those of Sussex (at least I think it was Sussex).
Fun fact – I developed a pretty strong “eastie” accent growing up in Boston and Brockton, but today it is so bastardized as to be comical – in addition to learning to speak in that environment, I inherited a lot of the way I talk from both sets of grandparents, who were active in raising me in my early years – on my mother’s side they were from East Bridgewater, Brockton, and Whitman; on my father’s side, Cape Cod (they lived in several towns along the coast, so it’s just easier to reference the Cape). The grandparents on my mother’s side ended up moving to Brunswick, Maine (Navy base) and then Durham. I spent many happy summers and winter holidays with them and two of my uncles, and I absorbed some of the eastern Maine dialect as well. Yes, I say “ayah” and “ayuh” – I’d love to get with an accomplished linguist or dialectician (is that a word?) one day and see if they can piece it all together.
If any ignorant yahoos are reading this, “querent” does NOT refer to sexual orientation.
Re: “querent”
I hate the possibility of spoiling something that I like, but…
I suspect that ESR has mixed feelings about my successful attempt to get folks on A&D to refer to me by my initials, but he doesn’t want to take a position here because it would have so much influence on everyone – he is letting people decide on their own.
On one hand, ESR became ESR from decades of working and dealing with the international hacker community. I did it here by inviting folks to do it.
On the other hand, if it was just ESR and RMS, I never would have tried to jump in that boat. But if it is ESR, RMS and JAD – well… that is a totally different story.
But the gripping hand is that I have been a weirdo and have had the hacker approach to the world ever since I could read beyond the “Dick and Jane” level… science, chemistry, electronics, ham radio, computer programming, mapping, data/databases, AI, placer mining…
JAD is a special case and in a way, I am a special case. Shall I write a poem about how weird it is to write a poem about how weird I am?
BRM
One of these days you are going to take the self-reference too far, causing you to become a thrall of Hofstadter. The GEB will appear on your right hand and on your forehead, you will speak to tortoises, and you will attempt to turn the entire world into a fractaline fugue.
— Foo Quuxman
That I use this blog for the sake of my ego is not good for my ego, so I will stop that for the sake of my ego and for the sake of everyone else, which is good for my ego.
I will, in fact, attempt to strongly ease up on the self-indulgence.
And, of course, we are all special cases.
Everyone on Earth is a special case, although some are a lot more distinctive than others.
Off topic: Android market share increases to 79%; both BlackBerry and Apple iOS shares decrease.
Speaking of victory, here’s another vindication on the horizon for you:
http://www.indiegogo.com/projects/ubuntu-edge–39
Yah, not so much unless Shuttleworth ponies up $10m or so.
http://www.theverge.com/2013/8/7/4594714/canonical-ubuntu-edge-crowdfunding-campaign-may-not-reach-its-goal
Not a good looking funding curve.
There is damning with faint praise. Is there also damning with faint victories?
An obscure reference from an obscure author is a victory?
A likely failed kickstarter from arguably the biggest Linux brand is a victory?
Meh.
Open-source victories are largely mainstream and corporate, like Android or iOS, where open source supplies infrastructure to support a closed-source ecosystem.
On the plus side, it means ESR was right and RMS wrong. But somehow even that victory strikes me as hollow, given that the enabled app ecosystems are still all closed source – which is still pretty much the only real way to monetize software.
$32 million was such a ludicrously high target (the most successful Kickstarter project ever, by comparison, raised just over $10 million) that everyone who ever thought this had a ghost of a chance at success was deluding themselves.
Which just goes to prove Nigel’s point: the best way to recoup the investment in time and resources it takes to develop software is to — wait for it — sell proprietary software.
Open source is winning the way Oceania is winning the war against Eastasia.
@ Nigel
Your first link doesn’t work.
@dtsund:
>$32 million was such a ludicrously high target (the most successful Kickstarter project ever, by comparison, raised just over $10 million) that everyone who ever thought this had a ghost of a chance at success was deluding themselves.
Part of the reason for the too-high target, I think, is that they massively overspec’d the thing. They don’t need laptop-level specs for what they’re doing (building a phone that presents a desktop-style interface with an existing application base when docked): a 10-year-old computer with 256 megs of RAM and a 1 GHz single-core CPU can present a perfectly good desktop UI (though they don’t need to cut back quite that much).
No it can’t. Not today. In order for a desktop UI to be acceptable, it needs to run at a smooth 60 fps, with no flicker or tearing, double-buffered, fully hardware composited windows. That takes quite a bit of CPU power and memory. And we’re not even getting into applications yet; a modern browser won’t run a typical browsing workload with acceptable speed in anything less than 1 GiB RAM, minimum.
In short — yes, you need a laptop-class machine to do the things a laptop does.
Oh, and I forgot GPU power in the above. :)
Microsoft turns the screw, following the CatB power-relationship model exactly.
My objections to this model still stand, to wit: some proprietary vendors so thoroughly own a space that even with repeated lockdowns and price jackings they can prevent open source from gaining a toehold for the foreseeable future; Adobe enjoys this position with print design and Autodesk with drafting and 3D modelling. Microsoft appears to have a similar deathgrip on corporate enterprise software; there is simply no substitute for the Office suite, or its level of integration with back-office services like Exchange, IIS, and SharePoint. The question becomes, how long can they keep it?
I was happy when the San Jose Mercury News (newspaper of Silicon Valley) finally stopped using the phrase “so-called Open Source”. At least, I haven’t noticed it for a while, thank goodness!
@Jeff Read
You are thoroughly confused if you think Autodesk owns 3D modelling. Maybe the very low end of the market. If any company could be said to own that space, it’s either PTC or Siemens PLM.
I find it interesting that the first distro of Linux to really nail it on the UI and garner this much attention in a long time, is such a blatant rip of Mac OS X that if Steve Jobs were still alive there’d be nothing left of it but a smoking patent-lawsuit impact crater.
Winning, indeed.
“I find it interesting that the first distro of Linux to really nail it on the UI and garner this much attention in a long time, is such a blatant rip of Mac OS X…”
This is a big win, *if* you’re into Mac OS X. I, for one, despise MacOS; it’s too glossy for me, and the “usability” is great if you like a lot of point-and-click work. But then, the first user interface I came to really appreciate was MS-DOS, but even when I appreciated it, I didn’t discover the “perfect” system until I discovered Linux several years later. And Linux had an ok-sortof UI as well. (To this day, I am convinced that *no one* currently comes close to doing UI right. I have some ideas to improve it, but I haven’t yet figured out how to find the time to perform the necessary experiments.)
In the midst of all this “Oh, the Mac OS X UI is so wonderful!” it should be kept in mind that it comes with this terrific interface – called a command prompt – that was extensively used at a place I worked where Mac OS was very popular.
That, and the core that powers Mac OS X, although it is closed-source, is a blatant rip-off of the *code* of BSD Unix (albeit, one I don’t complain about, and even approve of somewhat, since the BSD license allows for such behavior). And if *that* isn’t a sign that open source is winning, I don’t know what is!
@jeff
“I find it interesting that the first distro of Linux to really nail it on the UI and garner this much attention in a long time, is such a blatant rip of Mac OS X”
The beauty is only skin deep. Without the Core APIs and Cocoa unifying the apps, once you start running things it’s just plain old Linux. The APIs are key to the development of a consistent UX across the OS and apps, and that’s what makes the Mac very nice to use. That, and the users demand UIs that don’t suck.
“That, and the core that powers Mac OS X, although it is closed-source, is a blatant rip-off of the *code* of BSD Unix (albeit, one I don’t complain about, and even approve of somewhat, since the BSD license allows for such behavior). And if *that* isn’t a sign that open source is winning, I don’t know what is!”
Open source wins by enabling closed-source developers to make a better-than-subsistence living.
RMS would vehemently disagree that is winning.
Very kind of you to not complain about BSD being used as intended even though you just did by calling it a “blatant rip-off”. In any case, the core that powers OSX isn’t the FreeBSD userland but the Cocoa and Core APIs largely initiated under NeXTStep and refined in OSX.
NeXT had a BSD userland anyway, and I’ve seen some folks say that some of the OSX userland is actually from NeXT rather than FreeBSD.
“RMS would vehemently disagree that is winning.”
RMS also has a limited view of what freedom is. Indeed, I prefer BSD, MIT, and Apache-style licenses over the GPL precisely because you should have the freedom to do what you want with the source code, including changing it, compiling it, and distributing its binaries. There are certain costs to doing so – when you go down that path, you essentially fork the code base, so you can’t benefit from other people being able to see your code, and you have to constantly watch the original code base for changes if you want to make sure that your fork keeps up with the original – but if you really want to go down that path, go knock yourself out.
“Very kind of you to not complain about BSD being used as intended even though you just did by calling it a “blatant rip-off”.”
When I call it a “blatant rip-off”, I am doing it somewhat sarcastically, because Jeff is complaining about the eOS UI being a blatant rip-off of Mac OS X. My point is that Apple does such blatant rip-offs all the time – yet when someone else does it to them, they throw temper tantrums and then sulk about how the world is so unkind to them. I, for one, welcome “rip-offs”, because it means that ideas are growing and cross-pollinating, and that things are improving (even if the rip-off is done poorly – not that I’m accusing eOS of that – but I *would* accuse Windows of that, although I don’t want to go into the whys at this moment).
The eOS UI is a skin on GNOME. So you have those APIs unifying the apps. It’s not as nice as Cocoa, but it is something.
No. Let’s get one thing straight: Apple pays for and licenses technologies that may be of use in their products. They licensed the Xerox Star GUI technology from Xerox; by contrast, Microsoft simply saw what Apple was doing and copied it.
Ryan Dahl, of Node.js fame, summed up the philosophy behind Mac OS quite nicely when he said “the only thing that matters in software is the experience of the user.”
Apple considered it okay to borrow technologies such as Mach and the BSD userland when building its OS. Those are low-level infrastructure of little consequence to the user; Mac OS X could switch to the NT kernel tomorrow, and it would still be OS X. End users wouldn’t know the difference, and developer impact would be minimal. (This is more a thing than you might think; consider Safari for Windows and the port of Rhapsody to Win32 from the 90s.)
However, if anyone copies and sells an Apple-like user experience, I don’t know about current Apple leadership but to Jobs that was an unforgivable crime. The UX is what makes Mac OS Mac OS, and iOS iOS, and if you attempt to duplicate that UX then you are not borrowing technologies to make a new product, but simply stealing Apple’s product and selling it as your own. In so doing you are asking for the full wrath of their attack lawyers to be brought down upon you.
“No. Let’s get one thing straight: Apple pays for and licenses technologies that may be of use in their products. They licensed the Xerox Star GUI technology from Xerox; by contrast, Microsoft simply saw what Apple was doing and copied it.”
As far as I am concerned, whether you license it or you outright copy it, it’s still copying. You aren’t the one creating something new. The funny thing is, though, that this is a good thing: you perpetuate good ideas, and in the process, you learn how to extend those ideas, correct mistakes, and overall make things better. (Except in the case of Microsoft; it seems that, whenever they copy something, they make things worse, more often than not. I particularly hate Microsoft’s tab-completion, for example.)
“Ryan Dahl, of Node.js fame, summed up the philosophy behind Mac OS quite nicely when he said “the only thing that matters in software is the experience of the user.””
“Apple considered it okay to borrow technologies such as Mach and the BSD userland when building its OS. Those are low-level infrastructure of little consequence to the user; Mac OS X could switch to the NT kernel tomorrow, and it would still be OS X.”
Of course it’s ok to borrow from those sources…but I find the claim that these are of “little consequence” to the user to be rather funny. The reason why Apple used this for the core, rather than OS 9, or whatever else they had before, was because their original core was flaky. It couldn’t handle multitasking without crashing, among other things. I’d imagine that if NT were considerably less flaky than Unix, they’d switch to that instead…because it’s all about the user experience!
“However, if anyone copies and sells an Apple-like user experience, I don’t know about current Apple leadership but to Jobs that was an unforgivable crime.”
Jobs was full of himself. He was a nice designer, but he had some wacky ideas as to what is “usable” – and I, for one, find his idea of “usability” to be contrary to mine. His greatest crime was to mistake “ease of learning” for “ease of use”.
And while Jobs might think that copying is a crime, I seem to recall that when he challenged Microsoft in court for “stealing” the idea of a graphical user interface, he failed. And so have other companies, when they tried to protect keyboard shortcuts. And do you know what? I’m very glad they *did* fail! If they succeeded, we wouldn’t only be out of Microsoft Windows in all its incarnations (which wouldn’t necessarily be a loss), but KDE and Gnome, Enlightenment, Opie and GPE (in the days of Zaurus); perhaps even XMonad and Awesome and their kin would have been in mortal danger too. As much as I like the command line, I also like to use KDE for web browsing, and for containing all those command line windows!
Besides, it isn’t as though OS X is the first to have a nifty panel at the bottom, with a bunch of convenient icons: to me, this is suspiciously like Solaris’s CDE, except that it’s a lot more glossy. I never really liked CDE too much myself–but that’s more my preference–because I *did* think it was rather innovative.
“Besides, it isn’t as though OS X is the first to have a nifty panel at the bottom, with a bunch of convenient icons: to me, this is suspiciously like Solaris’s CDE, except that it’s a lot more glossy.”
No. It comes from NeXTSTEP and that was 1988. The craptastic dock from CDE came from HP VUE which was later but not by much.
Going with CDE was a huge mistake for Sun. They should have kept on with OpenStep.
“No. It comes from NeXTSTEP and that was 1988.”
Ok, I have just looked up screenshots for NeXTSTEP, OpenStep, and WindowMaker. They are kind of nice, but as far as I can see, they tend to put their panels on the right side, rather than centered on the bottom like CDE and Mac OS X. Not bad, but not necessarily earth-shattering, either. (At least, it doesn’t seem so to me, whose GUI experience is limited to Windows 3.1 and beyond, just enough Mac OS < X and Gnome to know I don’t like their decisions, KDE, OpenZaurus Opie and GPE, some CDE, a touch of Enlightenment, a touch of XMonad, and a couple of Android devices. If anything, I would consider the idea of having any sort of panel at all to be the important conceptual leap!)
In any case, these panel things are *still* very old; I, for one, am thankful they have been copied by pretty much everyone; and ridiculous Apple lawsuits notwithstanding, I am glad that eOS is copying them, even if I want nothing to do with Apple's silly interface (and am unlikely to use eOS in the future), because ideas are refined, built upon, and grow as they are copied.
Icons at the bottom date back at least as far as Windows 1.0, which reserved a space at the bottom of the screen for “minimized” applications. This was back before Windows supported arbitrarily positionable, overlapping windows — a feature Apple claimed IP rights over at the time.