Saturday, May 29, 2010

Rant: How U-verse [sm] Made My Life U-worse

Note: U-verse is a registered service mark of AT&T. As far as I can tell, U-worse is not (although perhaps it should be).

The full details of the U-verse disaster would exceed a reasonable length for a blog post, so I'll just vent over the highlights. Prior to my adventure with U-verse, I had phone service and cheap, slow (but fast enough for me) DSL service from my local telco (part of AT&T) and cable TV service from Comcast. The U-verse marketing people blitzed my part of East Lansing, MI last fall, first with repeated junk mailings (sometimes multiple times in a week), then with door-to-door sales people. I made the mistake of giving it a try. Things to watch out for when they come to your door:

There's no risk (30 day money-back guarantee).
Reality: I actually dealt with two different sales people (the reason unfolds below) and ended up placing two different orders, first for just TV service (I ended up canceling that order) and then for the TV/Internet/phone bundle. The guy who talked me into the first order said there was a 30 day money-back guarantee. The guy who sold me the bundle didn't mention it. Apparently not mentioning it took it off the table (?).
There's no installation fee.
This is true. There is, however, an early termination fee. (How that works in conjunction with the 30 day money-back guarantee is a bit unclear to me.) I terminated after one week (reasons unfold below) and got socked with a $165 early termination fee. Strangely, neither of the two sales reps I spoke to in my house mentioned this fee, nor did the woman who confirmed the service order over the phone. AT&T does not give you a written contract for U-verse, just a work order. (The contract is apparently oral.) The work order specifically mentions waiving any installation fee and says nothing about an early termination fee. That got me wondering about how a fee that is never mentioned ends up in an oral contract. According to a service rep I spoke to over the phone, they send you an e-mail message after the service is ordered and before it is activated. She said the message tells you about the early termination fee and says that activating the service makes you liable for the fee. I'll have to take her word for that, since I have no record of that message. (I thought I kept all e-mail from U-verse, and the first message I have is several days after the date she said that one was sent.)
You can keep your existing DSL service and just order U-verse TV service.
The final vote on that is 4-2 against. The first marketing rep said yes (so I ordered just TV). The first installation tech said no when he got to my house (so I canceled the order). Later on, a tech support person said yes (if you're keeping score, it's now 2-1 in favor of being able to retain your DSL service). Two other tech support people later said no (bringing us to 3-2 against). The deciding vote, for me, was when the local telco sent someone out to hook up my Plain Old Telephone Service again. He said no. As best I can tell, if you get Internet service via dial-up, cable or dish, you can keep what you've got. That may also be true if you get DSL from a telephone company other than AT&T. If you get DSL from AT&T, though, then to get U-verse TV you apparently must get U-verse Internet service (or at least you must give up your existing DSL service). You can still keep your existing phone service, though.
You get your local channels with all U-verse TV packages.
Reality: You get most of them. You may get all of them. Here in East Lansing, we don't. U-verse includes over-the-air channels that are the primary broadcasts for each local station, but if a station puts out multiple broadcasts over what are called subchannels, you don't get those. Bottom line in the Lansing/East Lansing market: we don't get the CW network (carried by the local ABC affiliate on a subchannel) and we don't get two of the three program feeds from the local PBS affiliate. I don't know if we're missing any others.
U-verse picture quality is much better than cable picture quality.
This obviously depends on your cable provider, so Your Mileage May Vary. I do not pay for HD packages (from either Comcast or U-verse), so I can't tell you whose HD feed is better than whose out here. What I can tell you is that on over-the-air channels, you get a much better picture from Comcast. The reason is that most if not all of our local stations broadcast at multiple resolutions, with the higher resolution signals ranging from 480i to 1080i (no 1080p that I can find). Comcast's "digital starter" package, the one I had before the U-verse disaster (and have again), does not carry any cable channels in HD (you have to pay extra for that), but it does carry the higher resolution feeds from the local stations. U-verse provides only NTSC (pre-digital conversion resolution) on the local stations ... unless you pay extra for an HD package, of course. So if you associate resolution with picture quality (as in, you have an HD TV set), the picture quality for the local stations is clearly better with Comcast (plus you actually get the local stations -- see previous point).

A coda to all this: once I finally managed to get my U-verse contract canceled (an ordeal unto itself), I had to hump the U-verse equipment to the local UPS store to send it back -- except the uninterruptible power supply for the gateway, which they will not pay to ship back. It's a bit ironic that the one thing not going back via the UPS is the UPS. Since the UPS is custom-made for their gateway, I can't find any other use for it, although it does make a dandy (if rather bulky) doorstop. But I digress. I took the other equipment to the local UPS store, handed it to the gentleman behind the counter, and prepared to explain the various paperwork to him. Before I could, he yelled back to the shift supervisor "Hey, we've got more AT&T stuff!" So I said "I take it I'm not the first person to bring this stuff in?" and he responded "Oh, no, we get this all the time." Perhaps I'm not the only dissatisfied (ex-)customer out here.

Thursday, May 27, 2010

Setting Environment Variables for NetBeans

I use both NetBeans and Eclipse as IDEs for developing Java code.  In some cases (notably when the code employs the CPLEX optimization library), I need to set an environment variable at run time (in the case of CPLEX, to point to the license file).  Eclipse provides an easy way to do this as part of the run time configuration for the project, but for whatever reason NetBeans lacks this feature (a singularly dopey omission IMHO).

After considerable time spent with Google, it appears that if you need to set environment variables on a project-level basis, the best option in NetBeans involves hacking the build.xml file in ways that I cannot begin to fathom.  Fortunately, if the environment variable works globally (across projects), there's a simpler answer.  In the NetBeans installation tree there's an etc/ folder containing a configuration file (netbeans.conf), which the launcher reads at startup.  Just add a line of the form export KEY=value and the problem is solved.
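To make the CPLEX case concrete: a minimal sketch of the edit, assuming an AT&T-era CPLEX that reads its license location from ILOG_LICENSE_FILE (the license path below is illustrative; substitute whatever variable and value your own library expects):

```shell
# Append to <NetBeans install dir>/etc/netbeans.conf.
# On Linux this file is sourced as a shell script when NetBeans launches,
# so an ordinary export line works. The path is an example only.
export ILOG_LICENSE_FILE=/opt/ilog/ilm/access.ilm
```

Restart NetBeans after the edit; programs launched from the IDE should then see the variable.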

Wednesday, May 26, 2010

Do We Fail Often Enough in O.R.?

I read an article today (in an alumni magazine) about successes in the physical sciences and engineering that occurred, either by serendipity or hard work, after one or more significant failures. The underlying thesis was that funding agencies have become overly risk averse, and their unwillingness to fund projects that do not look like fairly sure things may be inhibiting new discoveries.

That got me thinking about failure rates in O.R. (and, to some extent, what it means to fail in O.R.).  I'd like to avoid the recent debates about what we mean by "O.R." and take an inclusive view here, one that spans "pure" research (which in my experience usually means developing an understanding of the mathematical/statistical properties underlying common classes of models and algorithms), "applied" research (which for me is mainly algorithm construction, and perhaps model tuning) and application (solving problems).

I'm not sure to what extent pressure to extract grant money from risk-averse sources is an issue, but I think that in the academic world the "publish or perish" mentality, and the related formula pay raise = pittance + lambda*(recent pubs), pushes professors into doing very incremental, sure-fire work and not taking shots at problems that could require years if not decades of effort, and might not bear fruit.  In the business world, O.R. applications are typically (exclusively?) decision support endeavors, and informed decisions need to be made now, not a decade down the road when some algorithmic breakthrough occurs.  So I think we are collectively quite risk averse, but the consequences are impossible to measure.

The other question that struck me is how we define failure in O.R.  We build models that approximate reality, and if close approximations prove intractable, we can usually make looser approximations (subject to an occasional bit of derision from actual decision makers, or academics who profess a closer tie to reality) (which is to say, not economists).  On the algorithmic side, if we cannot find the optimal solution, the precise steady-state distribution or average queue length, or what have you, we can usually come up with an approximation / bound / heuristic solution and declare victory (and, in the case of consultants, pick up the last check).  The closest thing to failure that I can point to in my own experience is having a manuscript rejected by a journal, and even then it's usually a partial failure:  I shop the manuscript to another journal lower in the pecking order and iterate until acceptance.  For practitioners, I suspect the most common manifestation of failure is producing a solution that is never utilized (something I've also experienced).

So, circling back to the original question, are there things we should be doing but are not, things that might directly or indirectly pay some significant social dividend?

Saturday, May 15, 2010

Using Impressive on Mint/Ubuntu

I've earlier mentioned Impressive, a very handy program for presenting slide shows done as PDF files (e.g., using the LaTeX beamer class).  Today I finally managed to get a couple of features working on my laptop (Linux Mint "Helena") that had eluded me until recently.
  • The tab key toggles an overview mode where you can see (and jump to) any slide in the show. Beamer's method of displaying bullets etc. incrementally on a slide is to create a new PDF page for each "overlay"; so a slide ("frame") with five bullet items would occupy five pages in the PDF file. That's fine until you use the tab key in Impressive and see a few bazillion "slides" listed. The answer is to run Impressive with the option -O first in the command line. Well and good, except who wants to open a shell in order to run Impressive? So the answer was to right-click a PDF file, select Open With > Other Application > Use a custom command and set the command to /usr/bin/impressive -O first.  Simple enough.
  • Theoretically, with the right helper software installed (pdftk), you can click on a hyperlink in a PDF file while displaying it with Impressive, and Impressive will jump to the designated page (if within the document) or open the target in a browser (if the URL points outside the document). Small problem: it has never worked for me. A web search revealed that the version of pdftk (1.41+dfsg-1) included in recent Ubuntu repositories (and thus also available in Mint's package manager) is the culprit. Downgrading to 1.41-3ubuntu1 solves the problem. This requires three steps: uninstall 1.41+dfsg-1; download and install 1.41-3ubuntu1; and, in the Synaptic package manager, mark pdftk as not to be updated (otherwise Synaptic will try to upgrade it back to 1.41+dfsg-1 every time it finds updates, and it would be easy to forget to uncheck it in some mass update).
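For the record, the two fixes above boil down to a handful of commands. This is a sketch for an apt-based system: the version numbers are the ones discussed above, the .deb filename is illustrative (it assumes you have already downloaded the older package), and the dpkg "hold" is the command-line equivalent of locking the version in Synaptic:

```shell
# Run Impressive with overlays collapsed, so the overview shows one
# entry per frame rather than one per overlay page. The same command
# goes in the file manager's custom-command dialog.
impressive -O first slides.pdf

# Work around the broken pdftk build: remove it, install the older
# version (filename illustrative), and hold it so updates don't
# silently re-upgrade it.
sudo apt-get remove pdftk
sudo dpkg -i pdftk_1.41-3ubuntu1_i386.deb
echo "pdftk hold" | sudo dpkg --set-selections
```

The hold can be undone later with `echo "pdftk install" | sudo dpkg --set-selections` if a fixed package ever lands in the repository.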

Saturday, May 8, 2010

R Goes (Further) Mainstream

I use R for pretty much all my statistical computing these days. The commercial world has been using R for a while now, but that's about to get easier. An article on Developer.com indicates that Revolution Analytics (formerly Revolution Computing) is producing an enterprise-ready version of R that will (hopefully) make it easier for businesses to adopt R and perhaps will let R handle larger data sets more efficiently than it currently does (not a concern for me). Reaction to the announcement has not been uniformly positive, but I'm inclined to think that overall it's a good thing -- assuming their plans come to fruition. We'll see.

Thanks to Larry at IEOR Tools for blogging about the announcement.