
Presto: those who use the expression “there’s no such thing as a free lunch” to mean, straight off, that you can’t get something for nothing are missing something. The expression is American 19c, and it refers to the standing institution of the “free lunch counter”: a place where workingmen of a particular stripe could receive a rather sumptuous repast — or more often a sandwich — for the “price” of a drink.

The sense in which the free-lunch counter was not free is not that of the inexpensive alcohol served to accompany the meal: it refers, rather, to the political patronage system one had to enter to engage in such a pastime: that of the Democratic “machine”. The existence of such a peculiar institution was offensive to many for divers reasons; but although the standing principle incarnated in the adage’s modern variant is sounder, not learning the lessons of the American past is a treat reserved for the happy few.

What is not a treat reserved for the happy few is inexpensive and well-designed software, and if people think (as I once did) that Microsoft products are overpriced, this is partially because they do not remember other “sectors” of commercial computing, such as CAD programs that cost $10,000 a copy; and the competing products which would teach us the true virtues of, e.g., Microsoft Word — as opposed to its reliability and “ease of use” — are not around, perhaps on account of the Department of Justice.

“Open-source” software teaches valuable lessons about computing, such as its historical structure (earlier software is cleaner and simpler and more likely to be available today) and the reliance of the “cooperative commonwealth” on the good will of programmers and other IT professionals, including those working for rather powerful and “cutting-edge” companies. Sun is no “greenhorn”, and there are features in OpenOffice about which they may not ultimately “know what they do”, at least not yet.

If it is really worth doing quickly, precisely, and worldwide — such that the digital computer is an essential tool for accomplishing your task — the open-source products incorporated in the free POSIX systems available for fifteen years are a good deal; however, if what is being done is more or less a “reflection” of the ends of a particular “man”, Microsoft products are, as I said six years ago, very good indeed. If one must “smash the mold” at any price, I guess the very best technology and the very least access “under the hood” would be all right.

Material culled from a series of emails sent on June 17, based on work done a year ago:

It is rather obvious that model theory was invented in the fabulous ’30s, by the man who later coined the term “theory of models” and wrote further works on the topic philosophers don’t read since it “wouldn’t do”. I think the pudding-proof is that the Completeness Theorem — of which Löwenheim-Skolem (a deliciously Teutonic hyphenated name) is a consequence when you really understand what it says — is not really a model-theoretic result.
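To make the claimed dependence explicit (a compressed restatement of the standard results, not a formulation drawn from those further works): completeness says that semantic consequence and derivability coincide,

    \Gamma \models \varphi \quad\Longleftrightarrow\quad \Gamma \vdash \varphi,

and the Henkin proof builds the required model out of the closed terms of a countable language, so a countable set of sentences with any model at all has one with a countable domain; the downward Löwenheim-Skolem theorem drops out as a corollary of how the proof goes, not as an independent piece of model theory.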

The model-theoretic idiom in which the completeness theorem was cast (what’s his name and what was he about?) partially conceals that it’s just a consequence of the structure of the first-order proof system; if you doubt this, consider Kalmár’s proof for propositional completeness. We need model theory for limitative results (!) like non-arithmeticality, incompleteness, and maybe even the very unheimlich undecidability theorem (oh, if we cast the problem in that language of “productions” that mysteriously appeared at some point, I guess not).

Recent results suggest a vast preponderance of oracles make P unequal to NP; You Know This. What does this finding in “empirical mathematics” mean? I thought it meant this: the P and NP problem can be solved negatively not only because of the obvious observation about a proposition of n literals having 2^n combinations of stipulatively independent truth-values, but also BECAUSE THE CONTINUUM HYPOTHESIS IS UNSOLVABLE. “Sometimes you can put a 2^aleph-null set (the oracularly solvable SAT) and an aleph-null set in one-one correspondence, but most of the time you can’t.”
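For concreteness, here is a minimal sketch (my own illustration, not anything from the literature referenced) of the brute-force picture behind that observation: a CNF formula over n variables has 2^n candidate assignments, and the naive decision procedure simply walks all of them.

    from itertools import product

    def brute_force_sat(clauses, n_vars):
        """Naive SAT check over CNF: each clause is a list of nonzero ints,
        where k means variable k is true and -k means variable k is false.
        Walks all 2**n_vars assignments, i.e. the exponential blow-up at issue."""
        for bits in product([False, True], repeat=n_vars):
            assignment = {i + 1: bits[i] for i in range(n_vars)}
            if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                   for clause in clauses):
                return assignment  # first satisfying assignment found
        return None                # unsatisfiable

    # (x1 or not x2) and (x2 or x3): satisfiable; prints the first witness found.
    print(brute_force_sat([[1, -2], [2, 3]], 3))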

What could that even mean? That you are confused when you think you can do this, like people who believe in the Rebus of Picardy ‘cuz they read about it in Vico — or just ‘cuz. The “reducible” set of real cardinality, “solvable SAT”, is fake: if P=NP you already know the answers regarding satisfiability you would otherwise have to work out nondeterministically like any old validity testing in propositional logic, you just don’t know that you know them for whatever reason (including, I suppose, that it is lucrative to be confuséd).

To be crystal-clear about something you already understand better than me, what I am saying is this: it is a reductio that P cannot equal NP, because it would mean the end of logic and the end of the world; strictly speaking, the oracles that permit P=NP reduce the level of propositional complexity by one order, to the cardinality 2^aleph-null-minus-one, which, it is well known, is equipollent to aleph-null. This is a hysteron proteron. Furthermore, that people who float the idea that 99% of oracles are in the first class and one percent are in the second are in total possession of the true meaning of what they say is dubious: they could be schizophrenic, or in love.

UPDATE 6/29/09-6/30/09: If people are still “curious”, let me give my reply to the objection to P != NP from “oracles”. For those that don’t know, an oracle in computation/recursion theory is, roughly, a stockpile of precomputed answers to some membership question (encoded as states of a Turing machine, or as instructions for another model of computation) that a regular computer can “query” in a single step and use to determine its next state. Although “oracle machines” can solve problems that unrelativized TMs cannot, like the Halting problem, they stricto sensu do not change the complexity of a problem.
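As a rough picture of the definition just given (and only that), here is a sketch of an “oracle machine” as a procedure handed a black-box membership test; HALTING_ORACLE is a hypothetical stand-in of my own, since no such total function can actually be coded up, which is exactly what gives oracle machines their extra power.

    def oracle_machine(program, inputs, oracle):
        """Sketch of a relativized computation: an otherwise ordinary procedure
        that may pause, query a black-box set-membership oracle in one step,
        and branch on the answer it gets back."""
        results = []
        for x in inputs:
            if oracle((program, x)):          # a single oracle query
                results.append((x, "halts"))
            else:
                results.append((x, "loops"))
        return results

    # Hypothetical stand-in: no algorithm computes this total function,
    # which is why a machine equipped with it out-computes unrelativized TMs.
    def HALTING_ORACLE(query):
        raise NotImplementedError("assumed, not implementable")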

How do oracles touch the question of SAT? Like this: an oracle for a satisfiability-prover is essentially equivalent to quantifiers for a first-order language built upon the sentences of propositional logic being quantified over; a clue as to this is given in the first edition of Hopcroft and Ullman, where the oracle used to “prove” P=NP contains the set of all true quantified Boolean sentences. How can first-order logic and propositional logic interact? Well, the Henkin-style proofs of completeness for first-order logic involve “scapegoat” theories, where every existentially quantified sentence is supplied with a witnessing constant and thereby cashed out in terms of the quantifier-free instances that make it true.
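To make “true quantified Boolean sentences” concrete, here is a minimal evaluator of my own for closed QBF formulas (nothing fancier than recursing over both truth-values at each quantifier); the Hopcroft-and-Ullman oracle can be pictured as a lookup table of exactly the sentences this procedure maps to True.

    def eval_qbf(formula, env=None):
        """Evaluate a closed quantified Boolean formula given as nested tuples:
        ('var', x), ('not', f), ('and', f, g), ('or', f, g),
        ('forall', x, f), ('exists', x, f)."""
        env = env or {}
        tag = formula[0]
        if tag == 'var':
            return env[formula[1]]
        if tag == 'not':
            return not eval_qbf(formula[1], env)
        if tag == 'and':
            return eval_qbf(formula[1], env) and eval_qbf(formula[2], env)
        if tag == 'or':
            return eval_qbf(formula[1], env) or eval_qbf(formula[2], env)
        if tag in ('forall', 'exists'):
            _, x, body = formula
            branches = [eval_qbf(body, {**env, x: b}) for b in (False, True)]
            return all(branches) if tag == 'forall' else any(branches)
        raise ValueError(f"unknown connective: {tag}")

    # "for all x there is a y such that (x or y)": comes out True.
    print(eval_qbf(('forall', 'x', ('exists', 'y',
                    ('or', ('var', 'x'), ('var', 'y'))))))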

In categorial terms, an oracle is a “subobject classifier” mapping an n-tuple onto a truth-value; as previously mentioned here, adding subobject classifiers to a cartesian closed category (the categorial model of computability) creates an elementary topos capable of modeling first-order logic. What’s the catch for SAT? Firstly, the set of valid sentences of predicate logic is recursively enumerable, not recursive: an algorithm can “eventually” produce all the validities, but no algorithm can discover all the sentences invalid in predicate logic.
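In the category of sets, at any rate, the subobject-classifier talk cashes out very plainly: a subset is interchangeable with its characteristic map into a two-element set of truth-values. A toy illustration of my own of just that correspondence (the Set case only, not the general topos machinery):

    def classify(subset, ambient):
        """Characteristic map of a subobject in Set: each element of the
        ambient set is sent to True exactly when it lies in the subset."""
        return {x: (x in subset) for x in ambient}

    def subobject(chi):
        """Recover the subset from its characteristic map, the correspondence
        that makes the two truth-values a subobject classifier for Set."""
        return {x for x, v in chi.items() if v}

    ambient = [0, 1, 2, 3]
    subset = {1, 3}
    chi = classify(subset, ambient)
    assert subobject(chi) == subset
    print(chi)  # {0: False, 1: True, 2: False, 3: True}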

Secondly, an oracle that can be applied to SAT to make P=NP come out true is, as mentioned above, a device that reduces the complexity of the proposition being considered by one order — putting it within (well within) the class of transfinite numbers with a cardinality equal to that of the natural numbers. I find the most intuitive way to think about this is to picture an oracle containing the truth-value of every “literal” in the proposition under consideration, which adds just enough “randomness” for the proof procedure to be deterministic and in P from there on out; you could also think of an oracle machine, using quantified formulae, which decided the validity of the SAT sentence’s negation.
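That reading can be cashed out in a few lines: once a complete assignment is handed over (by the “oracle”, as it were), checking a CNF formula against it is a single linear pass, which is the part of the problem that sits unproblematically in P. A sketch, reusing the clause encoding from the earlier snippet:

    def check_cnf(clauses, assignment):
        """Deterministic, linear-time verification of a CNF formula (clauses as
        lists of positive/negative ints) against a complete truth assignment."""
        return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                   for clause in clauses)

    # With the assignment supplied up front, verification is immediate.
    print(check_cnf([[1, -2], [2, 3]], {1: True, 2: False, 3: True}))  # True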

Thirdly, as Löwenheim-Skolem (an essential part of Lindström’s Theorem) shows, any set of sentences that has a model — that is, an interpretation on which they all come out true — has a countable model, i.e. one whose elements can be enumerated like the natural numbers they correspond to; “richer” models of computation have a level of complexity which is, well, too complex. At any rate, oracles are not enough to override our overwhelming impression that P cannot equal NP; as for “natural proofs” and other recent developments, I like functional and complete propositional logic better than, well…

Now, for a piece of philosophy of mathematics. It’s hard to see how you could take a “sideways-on” approach to mathematics: it is true if anything is, right? Well, maybe we can stretch that out a little further than what is involved in the usual Third Realm calisthenics. A piece of mathematics is something that will be true for all time: mathematical insight means seeing into the ages, what was and ever will be. Conversely, lack of mathematical acuity — our inability to solve a problem — is caused by the deceptions of the age, our follies and delusions.

As regards my own very modest mathematical output, I have adduced considerations that P cannot equal NP in “A Cool Million (Some Thoughts on P and NP)”, “Further Thoughts on P and NP”, and “Why P Cannot Equal NP”. They do not have the form of a formal proof and have failed to convince the contemporary complexity community (as a whole), but it is my honest conviction that the very simple logical difficulties associated with reducing the complexity of DEXPTIME-hard NP-complete problems are insuperable. The considerations are “dumb”, but honestly, limitative results always are — the Incompleteness Theorem is a piece of simple trickery that would hold no interest were it not true.

Really, I think the hope that a brand new algorithm will crack the problem is a pipe dream; the essential nondeterminacy of the Satisfiability problem is just untouchable. Why does it appeal, then? Because computer science is about technical control, and the temptation of a computational Eden where all cryptographic algorithms can be cracked and mathematical proofs grow on trees is just too strong. (The harder-headed claim is that results about oracles — which are rather moldy, as they can be read about in the original edition of Hopcroft and Ullman — and other considerations may make the P and NP question impossible to solve; but it may just be our own damn fault for not defining our terms clearly. Always hard to tell.)

A scary story for Halloween: is it not in fact rather probable that the version of the patented RSA public-key cryptography algorithm available in PGP — which stands for “Pretty Good Privacy” — and SSL has a trapdoor for US security agencies? Microsoft Windows and other programs have been found to contain trapdoors for the National Security Agency, and there has long been speculation that the non-patented DES crypto-algorithm essentially contains one in the form of its 56-bit key length limit. Really, perhaps the “cyber-libertarians” of yore were a bit naive in thinking the agencies would let communications get away from them to that extent. (Even when I was more “privacy-minded”, I never used PGP, for the reason that simply sending encrypted messages would draw attention to me.) Although in this contemporary world of baring our, um, souls and credit-card numbers to all and sundry it may seem passé to think about this, perhaps even Internet businesses aren’t operating on the “root level” they think they are. Or are they?
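The 56-bit worry is, at bottom, arithmetic: the keyspace is 2^56, about 7.2 x 10^16 keys, small enough that purpose-built hardware (the EFF’s 1998 “Deep Crack” machine being the public demonstration) can walk it in days. A back-of-the-envelope sketch, with the keys-per-second figure an assumption for illustration rather than a measured number:

    # Back-of-the-envelope DES brute force: keyspace versus an assumed search rate.
    keyspace = 2 ** 56                # 72,057,594,037,927,936 candidate keys
    assumed_rate = 90_000_000_000     # keys/second; illustrative assumption only
    seconds = keyspace / assumed_rate
    print(f"{keyspace:,} keys, ~{seconds / 86_400:.1f} days to exhaust at that rate")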

Recently, I’ve been experimenting with the potential of Usenet to serve a constructive role in the Internet present. Here’s a recent Usenet post which sparked a conversation on the possibilities of the Portland metro area (edited to remove the manual formatting forced by Google Groups’ underpowered editor):

Newsgroups: or.general, pdx.general
From: jeffrubard@gmail.com
Date: Mon, 20 Oct 2008 10:53:16 -0700 (PDT)
Local: Mon, Oct 20 2008 10:53 am
Subject: The Changing Face of Portland

Sorry to break in here, but I thought the readers of these groups might like to hear something from someone who is certifiably insane, yet not a “Living Christ” (although please note that *Christ* is actually a word for “Christian”, not J.C. himself, in some European languages). Acute observers of the Portland scene and all-state dynamics may have noticed a shift in people coming to visit and coming to stay in the Portland metro area. Like previous immigrants and visitors to the state, these people are better-educated than the American norm and somewhat well-heeled: however, they do not hail from the “hip ghettoes” of New York and the Bay Area, or the “Inland Empire” of the Rocky Mountain states and the Old Northwest, but from other points within and without the United States.

This dynamic has a multitude of causes, but on any analysis it would be unwise for those admirous of older Northwestern norms to ignore it. Portland has always been a big city, and it has an excellent chance to become a “bigger” one: a recognized international center of economic and cultural innovation. Part of this means breaking with a local tradition and accepting the role of “bridge-and-tunnelers”, as well as the rest of the state, as part of Portland’s “charm”; part of it means the aggressive pursuit of possible area advantages on the part of both public and private institutions. The reputation of an era in state government as being akin to a “Western Mississippi”, paying for much less than other areas deem necessary in infrastructural upkeep and exploratory projects, will do no favors to an area influential people elsewhere are currently very well-disposed towards. If we are to collectively move forward, these false economies must stop. 

Otherwise, Portland will acquire, perhaps re-acquire, a reputation most Oregon residents have never experienced it as having: the status of an “also-ran”, a place that had a chance to show the way for the nation and the world and didn’t make it. Then there will really be “new wine in old bottles”, and Oregonians will collectively be sorry. 

Last year the shy, if not retiring, Infinite Thought put her readers onto the new “mininotebook” or “ultranotebook” computers. Previously, laptops that were significantly lighter or smaller than the norm had been limited to glorified calculators from HP and Tandy, then slow and expensive models from Sony and Apple: the digital plebs were limited to unrequited longing. However, beginning with the ASUS Eee, a series of small and inexpensive laptops has hit the market; costing as little as $300, these are well within the reach of anyone who could afford to plunk down dough for a full-featured MP3 player. Many models feature ingeniously tweaked versions of Linux as their operating system, which, like the MIPS-derived Loongson architecture for domestic Chinese computers, is an interesting development in global computing. However, I have my eye on a Windows XP version of the Acer Aspire One which features a larger screen and a magnetic hard drive.

Since many public spaces have gained (or really, retained) free Wi-Fi, I predict these devices will be “game-changers” for the real “digital divide”, that between economic haves and have-nots. As for me, I’m at the stage I was when I was a child reading BYTE in lieu of having an actual computer: my complicated Designs for Living involve me actually having the money at present, but not an environment in which I can confidently predict my ultranote will remain unstolen. If only… (Note for Portland readers: computers are cheap today, but if your budget is really tight please do consider volunteering at Free Geek and getting a free desktop, as well as the chance to hang out with some cool people.)

Question of the day: why are we still using a fifteen-year-old format for digital audio?

UPDATE: With broadband speeds around the world increasing precipitously, and the arrival of affordable portable digital players capable of holding much more audio than most people can comfortably acquire and organize, perhaps it’s time to go “back to the future” and make songs available in an uncompressed (and therefore “lossless”) format like WAV. If people are being asked to pay CD prices, they should get CD-quality audio.
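The arithmetic behind “CD quality” is fixed by the format itself, so the cost of going uncompressed is easy to estimate (the figures below are for the raw PCM payload, ignoring any container overhead):

    # Raw CD-quality PCM: 44,100 samples/s x 16 bits x 2 channels.
    sample_rate, bit_depth, channels = 44_100, 16, 2
    bits_per_second = sample_rate * bit_depth * channels   # 1,411,200 bit/s
    mb_per_minute = bits_per_second * 60 / 8 / 1_000_000   # about 10.6 MB/minute
    print(f"{bits_per_second / 1000:.1f} kbit/s, roughly {mb_per_minute:.1f} MB per minute")

Ten-odd megabytes a minute was unthinkable over dial-up and is a non-event over contemporary broadband, which is the whole argument in one number.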

In recent comments, I was talking to Union Street‘s Andrew about a longtime hobby-horse of mine, “theoretical republicanism”. A lot of signification is based on composing a view of the world that you want people to find compelling, maybe even over and above your proper purchase on them as a political subject: this has its place, but if equalitarian political structures are to work there has to be a certain structural equalitarianism in the field of discourse. It seems to me that one way this can come about is through what we might call “civic networking”: mapping signification onto a geographical plat. Here in Oregon we have a blog aggregator, ORBlogs, that tracks a wide variety of state blogs: maybe it’s not exactly the best that is thought and said in the world, but it gives an accurate impression of the “lay of the land” for this state, which is a real sociopolitical entity.

Perhaps this sort of mapping could be intentionally cultivated at local levels: although you don’t always want your neighbors to know exactly what you’re up to, there are real political relations that computer-mediated communication could enable rather than suppress, and a civic culture of “connectivity” might be worth having as a way to flesh out the real potentials of the place where you live. I don’t know how this would be done: I’m not sure “mini-Minitels” for cities would be a worthwhile investment as against independent initiatives, but there are limits to how much market-directed recomposition of interests and social formations would be desirable. You live somewhere, and your maxims and opinions find their role within that social space; maybe it’s important to have a way of forming a relatively accurate understanding of just which role you are playing.

Okay, it’s time to do some caring and sharing. I recently joined the elite group of people permanently banned from Adam Kotsko’s “The Weblog” (my first blog ban but not my first banning, though I guess I can always hope it’s the last). I had been trying to achieve this end for some time; as previously mentioned, I ran into foul weather with one of the Weblog regulars (hard for me to understand, since to me she resembled nothing so much as me at an earlier age) and being repeatedly taken for a jackass by her and other commenters caused me a lot of anxiety. Adam is actually a talented and widely-read guy (though his failure to understand logical concepts is a little embarrassingly ’90s, and his neither-fish-nor-flesh leftism a distressing symptom of the W-Zeit): I’m not really interested in theology, but I at least used to be interested in Continental philosophy, and as a member of the lay public who has a fairly extensive familiarity with the intellectual and political topics under discussion I am someone they would supposedly be interested in talking to, or at least humoring regarding my interest in their work.

This isn’t really how it is: an occasion for bringing up my banning was some comments by another “Jeff” which were relatively ill-informed but really quite in keeping with the natural dynamics of safe and sane computer-mediated communication. By contrast, the “blog elites” seem to view blogging as a kind of contact sport: and having done totally outrageous things in this vein myself before blogs existed, it makes me sad to see such a lack of consideration and lack of foresight regarding what can really happen. I really regret the harassment I felt forced to do in the past, and I think people who think of verbal abuse as some trivial matter really are missing a couple shades from the psychic spectrum. Since I’ve been declared Homo sacer (Adam nicely added a disclaimer after I called it a threat in email) I plan to stay away, and I hope people can see what I really think based on that.

UPDATE: Another one down — Unfogged, which featured a “comical” link to here. Noted sore loser and belletrist Ben Wolfson decided I had to go after revealing that Kieran Healy was one of their thousands of readers. I was a little verklempt, so it took two tries to get “All you had to do was ask, Ben” right, but I did. Not sure whether that was, qua signification, gone or real ill.

Like everybody else, I recently upgraded to Firefox 3.0. I am very impressed by most of the design choices, and more than impressed by the improvement in image quality: 3.0 has changed the way the Internet looks overnight, and if you’re attempting to rock inadequately anti-aliased images it now shows. As part of the upgrade I fished around in the add-on bin, and came up with the Sage RSS reader — “a lightweight RSS and Atom feed reader extension for Mozilla Firefox”. Though it’s a little primitive in some ways, Sage is an idea whose time has come: standalone RSS readers are dead but Web feed readers still give you all substance and no style, and that’s not the Web of the now (now that people with serious graphic design chops are getting in on the game).

If you turn off the feed display pane, the Sage sidebar is a very useful tool for collating Mozilla’s “live bookmarks”: as with Google Reader you can keep track of what you’ve already looked at, and nest blog (newspaper, library, etc.) feeds within subcategories of categories, but clicking on post titles in the feed summary will take you directly to the post in question — fully styled and with comments. Given the capacities of contemporary broadband, this only makes sense: there’s really no good reason not to use full HTML except that typing in blog addresses takes too much time and bookmarks can’t tell you “Click me, there’s new content”. Granted, you might have to look at some “creative” designs, and you might overinflate someone’s stats (though the latter seems to be the wave of the future in any case), but using Sage as your reader is definitely worth a try.
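For the curious, the mechanics underneath any such reader are slim. Here is a minimal sketch using only the Python standard library (the feed URL is a placeholder, and this does nothing Sage doesn’t already do with more polish): it just pulls the titles and links out of an RSS 2.0 feed, which is the raw material a reader keeps its “already looked at” state over.

    import urllib.request
    import xml.etree.ElementTree as ET

    def fetch_feed_items(url):
        """Fetch an RSS 2.0 feed and return (title, link) pairs for its items."""
        with urllib.request.urlopen(url) as response:
            root = ET.fromstring(response.read())
        return [(item.findtext("title"), item.findtext("link"))
                for item in root.iter("item")]

    # Placeholder URL; substitute any RSS 2.0 feed you actually read.
    for title, link in fetch_feed_items("https://example.com/feed.rss"):
        print(title, "->", link)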