Delectable Duck!

My most loyal readers should already be salivating, as they have all personally had the experience I’m about to describe. 🙂

Those who know me know that perhaps my favorite restaurant (though there are many that I love) is the Peking Duck House in midtown NYC. I had dinner there last night. This isn’t unusual, as Lois and I have calculated that (including delivery) we have enjoyed somewhere between 500 and 1,000 meals in/from that restaurant over the past 25 years!

I’m writing about it now because last night was the first time I was there since I started blogging, and the meals there are at least worth mentioning once. 😉

There are many restaurants in NYC that serve Peking Duck. There are relatively few that serve it outside of the major metropolitan cities. This is largely due to the fact that it’s a time-consuming process to prepare the duck (typically 24 hours!), and it’s expensive. So, restaurants don’t want to prepare a number of ducks, only to have them go unordered.

When your main calling is delivering delectable ducks to the majority of your customers, you don’t worry about this, and prepare an insane number of ducks every single day. Their home page currently highlights a picture of what the duck looks like sliced up, but I don’t know if they rotate the home page photos regularly…

Last night, Lois and I had the pleasure of going with a couple who had not dined there before. That said, both are Peking Duck lovers, who considered the duck at Shun Lee Palace to be “the best”. I’ve eaten there many times in my life, including having the duck there. I hadn’t been there in over 10 years, but went there for lunch a month ago with another VC who chose the place. It’s a fantastic restaurant, and I have nothing but praise to heap on them, but unfortunately (for them), their duck, as outstanding as it is, doesn’t measure up to the Duck House.

So, I will admit to being a little nervous as to how our friends would rate their new duck experience with “the best”. All I can say is that they are either very classy and very good actors, or I should believe them when they say that the duck they had last night was officially the best duck they had ever had (Shun Lee Palace included). Whew 🙂

Basically, this post is done, since it’s ostensibly about the duck. For those who clicked on this only for the duck, stop reading!

OK, if you’re still here, then you either know that the rest of the menu is fantastic as well, or you just couldn’t resist finding out what else I had to say about this restaurant…

Over the years, with as many meals as we’ve enjoyed there, we have developed a set of favorite dishes that never fails to satisfy. Amazingly enough, nothing that has ever been on that list has fallen off due to a decline in consistency or taste. Dishes sometimes don’t make the cut simply because there are too many great choices, and new favorites end up bumping an old standby that won’t fit in your stomach during that particular meal.

Last night is a good example. Typically, a duck produces 10 pancakes. When there are only a few people sharing the duck, the waiters at the Duck House will stuff each pancake a little more, knowing that two people won’t necessarily want five pancakes each. Lois never eats the duck (don’t ask) 😉 so there were only three duck eaters last night.

That could have been three pancakes each, easily. For most people, that’s an entire meal. For sure if you add any soup/appetizers and rice, etc. But, somehow, we always manage to order more than we should there. Last night was different, as we wanted our friends to at least taste a few of our favorites. They were certainly game. 😉

So, we ordered our current top three non-duck favorites:

  1. Orange Chicken (undoubtedly the greatest version of this dish ever created, anywhere on earth, I’ll bet on it!). This has been my personal top choice for 25 years, and I’ve never wavered.
  2. Grand Marnier Prawns (a relatively new favorite at this restaurant, as it was introduced less than 2 years ago there. We were enjoying it 20 years ago at a long-gone place called Fu’s.) This dish is also mind-bogglingly delicious.
  3. Paradise Beef. This is their take on filet mignon. Lois and I are always amused when people realize they can get top quality steak, in a Chinese restaurant. This dish has been consistently outstanding for 25 years as well.

There wasn’t room (in any of us) to even consider other dishes, like the amazing Prawns in Chili Sauce, etc. In fact, as I mentioned above, I really thought we’d each just sample the above dishes. Instead, we devoured all of them; there was one measly strip of Paradise Beef left that I think everyone was eyeing, but we somehow decided that we wouldn’t look like gluttons if we left something on at least one plate. 😉

Of course, then there was dessert (Lois and I are such regulars over the years that we always get dessert on the house, whether we want it or not!). Green Tea ice cream, Red Bean ice cream and lychee nuts (and of course, chocolate and vanilla ice cream as well). I’m very proud to say we only finished half of the dessert… 😉

All in all, a fantastic evening of great food, even better company and yet another fond memory of a meal at the Duck House. I’m sure we added two more regulars to their roster…

P.S. Now for another irony that the universe loves to throw at me. I had considered blogging about this even before we went last night. When we got home, I was sure that I would do it when I finally had a second to breathe today (which turned out to be mid-afternoon). But, in the morning, a good friend, fellow VC, many-time co-investor with me, LP in my fund, etc., etc., etc., wrote out of the blue to say that he and the family would be in town in two weeks, and could Lois and I join them for a meal at the Duck House. Guess who first introduced him to it? 😉

Unfortunately, we’ll be down at Zope on the day they come to town, but they have already confirmed that they’re going without us, so apparently we are not the primary attraction in this city… 😉

I’m a Phone Weenie

OK, I admit it. Rob Page, CEO of Zope Corporation officially coined me a phone weenie a number of weeks ago. He’s right. Of course, he also directly benefits from that feature in me, since I am the primary Asterisk PBX administrator at Zope Corp. 😉

So, what does it mean to be a phone weenie? Basically, it means that I get excited about every possible way to interconnect phone systems, even when there’s little practical use in doing so. 😉

For example, here are a variety of ways that I have at my disposal to “speak on the phone” (and by phone, I often really mean a headset connected to my laptop). First (and in this case foremost), I run multiple Asterisk PBX servers: one in my house, one in my apartment, one in a data center in Virginia, one at Zope Corporation (which I’m in charge of), and one or two others that I’ve set up for friends. This is my primary way of getting phone calls. It’s also the only way (other than her cell phone) that Lois makes phone calls.

I (on the other hand), rarely make PSTN outgoing calls via Asterisk any longer (though I certainly used to, for years). In addition to Asterisk, I’ve had accounts with a variety of VoIP providers (mostly SIP) for years, including: FreeWorldDialup (FWD), SIPPhone (also providers of the Gizmo Project), etc. Of course, I’ve also had a Skype account for years.

Being a phone weenie, I eschewed the use of Skype for a long time, and only launched it when someone else specifically asked me to have a conversation via Skype. Why? Because Skype isn’t standards compliant and therefore can’t interact (easily) with other systems, including SIP, Asterisk, etc. That said, everything that people said about Skype was also true, namely that it was easy to use, had great sound quality, could get through most firewalls (which is why corporate America rightfully hates it!), etc.

Last year, Skype offered 7 months of unlimited calling to any US or Canadian land-line or cell phone (using their SkypeOut service), if you were a US or Canadian based caller. Who could pass on free calls? So, I used Skype a number of times for that. For the most part, the quality was good, so when I remembered to, I’d use it for long-distance calls. The only downside was that I wasn’t sending my own Caller ID. To some, that’s a positive, but since I am not a telemarketer, I don’t mind people knowing that it’s me that’s calling. 😉

In January of this year, Skype stopped the free service, but for $30/year offered the same unlimited calling. And if you acted before January 31, 2007, you could get the same service for $14.95 for all of 2007. I did. It works. I like it.

But, I still didn’t launch Skype unless I was about to make a long-distance call, or someone asked me (typically via email) to get on Skype. Lately, I’ve been doing some business with a company in Europe. Of course, no one wants to pay for International calls. It turns out that they are all Mac users, and all had Skype installed already. So, rather than fighting the tide, I now launch Skype regularly (still not automatically at login), and I speak to them on the phone frequently, and use Skype IM more than I thought I would. It’s still a very nice package with excellent sound quality.

But, that’s not really the purpose of this post. As usual, I type way too much before getting to the point. I don’t mind, because I’m only blogging for myself at the moment, and these are the specific thoughts that I want to capture for myself for posterity, for now…

Here’s the point of this post 🙂

When it was first announced, I got very excited about Google Talk (Gtalk). I thought it would be a formidable opponent to Skype, but would be more standards compliant and therefore interoperable. Well, there may be a ton of people using it, but none of the people that I interact with have ever asked me to have a conversation with them via Google Talk! Ironically, Gtalk licensed the same codec that Skype did (from Global IP Sound) so their sound quality is pretty exceptional as well.

In any event, Gtalk was supposed to be more interoperable because they were based on XMPP (which is the protocol underneath the fantastic Jabber IM service). That has finally turned out to be true, but it was a long time coming. The fact that it can interoperate with other Jabber servers is of minor interest to me personally. I run my own Jabber server, and can interact with any other Jabber server on the planet that turns on that feature.

What has been interesting to me (in theory) is whether Gtalk can be a client with critical-mass distribution (like Skype is already), but be able to work seamlessly with things like Asterisk (and other existing and yet-to-be-developed services). Google has the brand and the muscle to move as many free clients as Skype, so my dream lives on.

So, the latest version of Asterisk (1.4.x, which just became 1.4.1 this week) has support for Gtalk built in. I don’t know whether it works well or not, because I run the more proven 1.2.x branch (now at 1.2.16 as of this week). That doesn’t have the support.

Today, I read a posting on the Asterisk mailing list by someone who used a new service to connect the two, without installing any software on either the server or the client! What? Yes, that’s right, and I have to say that I got it working pretty easily, and its usefulness is much greater than simply connecting Asterisk to Google Talk.

The service is called gtalk2voip. It’s extremely cool, and completely free for basic services, which is all I am interested in! It is a gateway service that knows how to communicate with Google Talk, MSN Live Messenger, and Yahoo! Messenger. In addition to being able to send IM (through gtalk2voip) between the services (I know, big deal), they can connect voice calls between the three services. That alone would be cool, though I suspect that over time, the services will connect to each other (as many IM services have done in the past, reluctantly). But, what this service can do (and trivially!), is connect any of the three services to any SIP address on any arbitrary SIP provider (including Asterisk, which has built-in SIP support).

So, I clicked one button on the gtalk2voip home page and typed in my gmail address (which is tied to my Gtalk service). One second later, I had a message in Gtalk asking me if I wanted to add a new Contact “”. Once I said yes, the new service IM’ed me with instructions on how to use the service (cool). I made a 2-line change to my Asterisk configuration (many wouldn’t need to even do that), and I was able to make a call from Gtalk to my home phone, entirely through SIP (Asterisk controls all of my phones).

I was then able to add a new extension in Asterisk, and when I dialed it from my home phone, Gtalk started ringing. It just worked. No downloads, no real work. If you have a SIP address on any service (FWD, SIPPhone, GizmoProject, or your own server, etc.), then you have zero configuration after you’ve accepted “” as your new contact. I wasn’t publishing a SIP address to the world, since my Asterisk gets called when people dial my real PSTN number, so I had to add a few lines to expose a SIP address…
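For flavor, here is roughly what those dialplan additions look like in an Asterisk extensions.conf. To be clear, the extension number, peer names, and especially the gateway’s SIP addressing below are illustrative guesses on my part, not gtalk2voip’s documented syntax:

```
; Outbound: dialing 7001 from a house phone rings my Gtalk client
; through the gtalk2voip gateway (the address format is a guess).
exten => 7001,1,Dial(SIP/my-gtalk-contact@gtalk2voip.com)

; Inbound: expose a SIP destination that rings the real house phones
; when the gateway delivers a call from Gtalk; give up after 30 seconds.
exten => home,1,Dial(SIP/kitchen&SIP/office,30)
```

The second block is the “few lines to expose a SIP address” I mentioned; anyone whose Asterisk box already answers arbitrary SIP URIs wouldn’t need it.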

OK, so this is way cool, except that I still don’t think Gtalk has really caught on yet. But, if it does, I’m already ready, which is wildly exciting.

Now for a bit of irony to conclude this long post. For the past 10+ years, perhaps closer to 15, I have been a member of a computer book club which has had a number of names, now called Computer Books Direct. It’s a pretty darn good book club, but since I stopped working as a programmer, I haven’t bought a book from them, which has been 9 years now. I haven’t canceled my membership because I have 38 member credits which I can use for either free books, or greatly reduced pricing. Some day, I suspect I’ll use them. In the meantime, I have to watch my snail mail like a hawk, to ensure that I don’t forget to cancel this month’s selections. I haven’t missed in at least 8 years, so I’m good at it.

Today (that’s right, today!), after I started writing this post, but before I finished it (obviously), the mail came, and in it was my monthly selection that I needed to cancel. There were two selections today, both about Google. The top selection was a book called Google Talking, I kid you not! Here is a link to their page. I’m not buying it, but somehow, somewhere, the universe is nudging me in that direction… 🙂

SPAM is back under control

I know, what a silly thing to say, and in public at that!

Previously, I posted on my woes with my old SpamBayes filtering starting to fail, and with new procmail rules (that I foolishly put in myself) causing me to suffer from spam more than usual. I am happy to report that it’s back under control, mostly thanks to the fact that I’ve finally spent some time studying other people’s procmail rules and learning a bunch of techniques that I was previously unaware of. The truth is that I was a complete luddite, using procmail in a completely vanilla fashion.

My biggest single breakthrough was in realizing that I could run any set of tests against any arbitrary file, rather than having to wait for an email to come in and see whether my new test worked or not. Doh! So now, when an email comes in that I believe I can trap in the future, I copy it out, write some rules, run procmail against that file until I’m happy, and insert the rule into my real procmail file. Cool!
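In practice the loop is just a couple of commands. A minimal sketch (the file names are illustrative, and it assumes procmail is on your path):

```
# Save the suspicious message to a file, then iterate:
#
#   procmail VERBOSE=on ./test.rc < saved-message.eml
#   cat test.log
#
# where test.rc holds only the rule under test plus a catch-all, e.g.:

LOGFILE=./test.log

:0
* ^Subject:.*(pattern under development)
./caught

:0
./passed
```

Whether the message lands in ./caught or ./passed (and what the verbose log says along the way) tells you immediately whether the rule does what you think, with no waiting and no risk to real mail.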

Paying appropriate homage to the ones I learned from, here are two sites that got my juices flowing:

The first site above concludes by saying that you don’t need to know what he just taught you, as you can install SpamBouncer instead. I installed it, and I have to say it’s a mind-bogglingly sophisticated system. I can’t believe how much work has gone into this. That said, while I learned a lot there too, and will probably go back again and again to some of the recipes and techniques in there, I am not going to “put it into production”.

Why? Essentially three reasons:

  1. It’s incredibly slow in processing messages (understandably so!) as it goes through more tests than you can shake a stick at (and possibly connects to outside servers as well, but I’m not 100% sure about that yet).
  2. It’s very CPU intensive. I could likely live with the slowness, given how good a job it does, but I run many other things on the machine, including some sensitive applications (e.g., Asterisk), so I prefer not to load the CPU when possible.
  3. The last time SpamBouncer was updated was 4/16/2006. It’s too large a system for me to want to actively maintain, and given the speed with which spammers morph their capabilities, it’s simply easier for me to toss in a new rule or two into my own anemic set of rules.

Anyway, all I can say is hooray 🙂

Now, in case it hasn’t been completely obvious to my numerous readers, I’m a relatively passionate person (OK, let’s not use euphemisms, the correct term is obsessive). As a result, much of my recent “free time” has been spent in this pursuit. As a result, my other previous obsession (online Poker), has taken a back seat. I have played a total of 2-3 hours of online Poker in the past 3+ weeks. For those who know me, that has to be a shocking fact 😉

Noting the above, it’s clear that there are other ways to solve this problem. Notably, my good friend, and one of the most tech-savvy people I have ever met, Jamie Thingelstad, chose to throw in the towel. I completely understand his decision, and might arrive there at some point in the future. Still, the contents of my emails are the heart of my business, and I can’t imagine parting with them and putting them in someone else’s care (at least not yet). He, and many others, have been trying to get me to switch to a Mac for years as well, and that hasn’t happened yet either 😉


In Praise of SIDUX

I said I wouldn’t write about the old Dell Latitude L400 again, unless there was a “breakthrough”. Well, there hasn’t been one, so I’m not really writing about that machine (but of course, I really am) 😉

Really, this is more of a Linux posting than a specific laptop one…

So, unless you’re a real geek, or someone else is making decisions for you, it’s simply mind-boggling how many Linux distributions (“distros”) there are out there. For the techies at heart, perhaps that’s a good thing, as there has to be some distro out there that is preconfigured to your personal taste. Statistically speaking, since there seems to be at least one distro for every three people on the planet, that statement has to be true, right? 😉

No wonder that Linux doesn’t catch on with the mainstream public. I don’t pretend to understand all of the issues with why one should pick one distro over another, nor why so many distros get “forked” or built on top of, etc. I listen to what other geeks have to say (many of whom work at Zope Corporation) and hear which ones they use and wish others did, etc.

Anyway, back to the old laptop, of which this post is ostensibly not about 🙂

I decided to completely abandon any hope of using it for the originally intended purposes (Poker and remote Slingbox machine). Still, I was happy with it for the short time that I was using it as a guest browsing machine in my apartment. At that time it was running Ubuntu 6.06. A number of the developers at Zope Corp swear by it. In all of my trials and tribulations on this laptop, it was by far the most stable and least hassle to get running.

So, I decided that I would wipe out all traces of my previous attempts to install a dozen other operating systems, and go back to Ubuntu 6.06. Of course, I couldn’t get the same exact CD that I previously used to install again. Go figure… 🙁

I downloaded Ubuntu 6.10 (the current stable release), and it took me numerous tries before I got it installed. When I did, it wouldn’t boot stably, even with acpi=off (which wasn’t necessary to add the last time I successfully installed Ubuntu).

Those of you who have read my previous rants on this laptop know that I had hoped to use PCLinuxOS as my personal distro. Apparently, they still have “problems”, and my setup was not spared. In particular, I could never get my WiFi card working, and it’s an Orinoco, which you’d think every distro would correctly support, if they do any WiFi at all…

So, on to find another distro, or finally, toss this machine into the actual garbage dump.

Looking on the wonderful Distrowatch site, I happened to spot a distro called SIDUX. It had all of the features that appealed to me with PCLinuxOS, but was Debian-based (like Ubuntu), and I’ve had good personal success with Debian-based systems in the past, including Xandros.

All I can say is wow! It just works. It installed correctly the first time, had the Firefox (I mean Iceweasel) update when it came out, etc. I had it up and running for three straight days (not under any particular load, but still, pretty amazing for this particular machine). It has crashed a few times, so there’s still something sickly wrong with the laptop, but this has been by far the most stable operating system that I’ve had on the box, and it’s attractive and reasonably peppy (on ancient hardware) to boot.

I’m pleased that I now have a reasonable guest browsing machine, and I do not intend to dork with it any longer, and this time, I’m serious. 🙂

P.S. On SIDUX, I was able to connect to a remote RDP server, over NAT from a hotel, using an SSH tunnel, and play in one full Poker tournament without anything crashing. The graphics were painful, so I won’t be doing it again, but the fact that I could do it at all was shockingly cool! Unfortunately, I finished one out of the money (bubble boy), so SIDUX didn’t help in that regard… 😉

SPAM Problem Solved!

OK, not really. That was only a slight exaggeration 😉

Seriously, the specific spam problem that I complained about in my “technology is random” posting is what I’ve now solved.

As I mentioned in that post, I had a combination of procmail rules and SpamBayes filtering, etc. I completely turned off the old SB filtering, because at first I thought that somehow it was causing the emails with attachments to be deleted. Only when I did that, did I notice that it was throwing away other emails simply because it was incorrectly tagging them as certain spam (score of 1.0). I couldn’t believe that, but like I said, since I wasn’t updating the db, it was degrading.

So, I turned off the SB filtering, and still, emails were being sent to /dev/null on the server if they had large-ish attachments. That meant that one of my other procmail rules was kicking in. I looked at each (I have many) very closely, and couldn’t imagine which might be causing this.

Also as mentioned in the previous post, I temporarily fixed this by creating a procmail-based white list, which (unfortunately) was both after the fact, and growing steadily.

I also went back and, with a few carefully crafted grep and tail pipelines, was able to identify other emails that had quietly been thrown away, and then contacted those (very surprised) authors and asked for a resend.
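The pipelines were along these lines. This is a self-contained sketch, not my actual commands: the log path and entries are fabricated for the demo, and the only real assumption is procmail’s usual three-line-per-message log layout (a From line, a Subject line, and the destination Folder):

```shell
# Build a tiny sample in procmail's per-message log format, then pull
# out the senders of everything that was filed to /dev/null.
cat > /tmp/procmail-sample.log <<'EOF'
From alice@example.com  Thu Feb 15 10:00:00 2007
 Subject: Q1 draft (big attachment)
  Folder: /dev/null                            123456
From bob@example.com  Thu Feb 15 11:00:00 2007
 Subject: lunch?
  Folder: /home/user/Mail/inbox                  1234
EOF

# -B2 grabs the two lines above each /dev/null delivery; the second
# grep keeps only the sender lines, and tail limits it to recent ones.
grep -B2 'Folder: /dev/null' /tmp/procmail-sample.log | grep '^From ' | tail -20
```

Run against the real log, that spits out exactly the list of people whose mail silently vanished.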

OK, on to the solution (almost). Yesterday, an old boss of mine (no, he’s not that old, but I haven’t worked for him directly since 1989!) asked me to review a 384-page document that he had written (no, I’m not kidding about the size). People who know me know that I (and Lois) are like an echo when it comes to email (think “ping pong”). When he didn’t get an acknowledgement from me within an hour, he assumed that something was wrong.

He sent me another email, asking if I’d gotten the file. Of course, /dev/null had eaten it…

I white listed him, and got the file (which is how I know the size, as at first he scared me by telling me that it was 400 pages) 😉

That got me to thinking that I now had a specific attachment that I knew would fail. I ended up sending it to myself from an account that wasn’t white listed. It got thrown out immediately. Bingo! Now I was at least in control of my own destiny, since I could provoke the problem any time I wanted to.

The next step was easy (and obvious). I turned on verbose logging in procmail and resent the email. You might ask “Why the hell didn’t you turn on verbose logging earlier?” Good question. Aside from not really thinking about it, I must have known (intuitively) that my disk would have filled up waiting for a “bad” email to come in and provoke the problem. Even asking someone to resend would have an unacceptable lag in waiting for them to see my email and act on it, etc.

Logging showed that I was being completely stupid in one specific rule. As the rest of you must know, one of the most popular email annoyances is the pump-and-dump stock scheme. They promote a specific stock as the next moon shot. Many are traded on an exchange with a code of PK (for the few of you who don’t know, that’s the Karachi Stock Exchange in Pakistan, a place where I am dying to find a good stock deal!) 😉

So, I started a little procmail rule that matched any symbol from those emails that I was sure (and here comes my ultra-stupidity) couldn’t occur in a normal email. So far so good, right? As an example, let’s say that one of the symbols was “JMNX.PK”. Come on, would I worry about accidentally deleting an email that had that string of characters in it?

Well, mistake number 1 (the tiny one) is that without escaping the “.” in the above symbol, the dot matches any character, so if a buddy sent me an email saying “Howdy, check out JMNXOPK”, I would never have seen it. Hopefully, I’d survive such a faux pas. But, over time, I added shorter symbols. Notably, one was PHYA. Again, I wasn’t “worried” that someone would send me a legitimate email with that in it. This was mistake #2, and clearly the biggie…

When someone sends you an attachment, it gets encoded, typically in base64, which is an ASCII encoding. That means it is converted into a series of apparently random characters. The bigger the attachment, the more of these random characters, and the more likely it is that any given 4-letter combination will appear.

So, it turned out that the 384 page document had the string “pHYa” in it. Note that procmail was kind enough to be case insensitive so that “pHYa” matched my input of “PHYA”, reducing the number of random combinations I had to sweat out.
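To make the inevitability concrete, here is a small Python sketch. The helper name and demo are mine, not part of my actual setup; it just shows how a short pattern like PHYA behaves against base64-encoded data:

```python
import base64
import re

def hits_rule(raw_bytes, pattern="PHYA"):
    """Return True if a procmail-style, case-insensitive pattern match
    would fire against the base64 encoding of this attachment."""
    encoded = base64.b64encode(raw_bytes).decode("ascii")
    return re.search(pattern, encoded, re.IGNORECASE) is not None

# Rough math: case-insensitive matching accepts 2**4 = 16 variants of a
# 4-letter pattern, out of 64**4 (~16.8 million) possible 4-character
# windows. A multi-megabyte base64 attachment contains millions of such
# windows, so at least one hit is close to inevitable.
```

A 384-page document easily clears that bar, which is exactly what happened to me.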

Of course, in retrospect, I was an idiot, and the inevitability of the match is obvious. The solution is trivial too: delete the rule 🙂 Now that it’s gone, it’s just as simple to add at least another step that checks for any number of other typical pump-and-dump keywords along with the ticker symbol, and that should work just fine. In the end, it was laziness on my part, coupled with the fantasy of catching every occurrence of that particular type of email, that did me in.
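Roughly, the fixed approach would look something like this. The ticker and promo phrases below are illustrative; the only real procmail features assumed are its egrep-style regexps, default case-insensitive matching, and the B flag for body matching:

```
# Delete only when BOTH a known pump-and-dump ticker (dot escaped, and
# bounded so it can't match inside a base64 blob) AND a typical promo
# phrase appear in the body.
:0 B
* (^|[^A-Za-z0-9])JMNX\.PK([^A-Za-z0-9]|$)
* (strong buy|target price|huge gains expected)
/dev/null
```

Requiring two independent conditions makes an accidental match against encoded attachment noise astronomically unlikely, while still catching the real junk.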

All I can say is amen, a modicum of sanity has returned to the world…

Debugging Firefox and WordPress 2.1 UI issues with Firebug – SUCCESS

I’m typing this to provoke the following error:

uncaught exception: Permission denied to call method

So far, not provoked, which means that I think I figured out what was killing me ever since I upgraded to WordPress 2.1.

Before I declare victory, let me try to insert a link, which is what I was having the most trouble with before. Yippee!

The part of the post from the above link that made the lightbulb go off in my head was where megatron5151 points out that links with “” were treated differently than “” for the purposes of the browser deciding that there was a cross-domain scripting issue.

I realized that I was redirecting all URLs with either wp-admin or wp-login in them to require https. That made the base URL of the site different than the administrative part of the site (only due to my redirection, not because of anything that WordPress was aware of!), and so the new AJAX niceties that were introduced in WordPress 2.1 were being turned off by Firefox (correctly!), like autosave, etc.

The simple solution (and I’m not sure whether I have compromised security here or not, so if anyone is indeed reading this, which I doubt, and you know the answer, please let me know!) is that I made only wp-login redirect to https, and once logged in (presumably avoiding my password being transferred in the clear over the wire), I revert back to plain old http. If the rest of the authentication is done via cookies or session ids, I guess/hope that I’m fine from a security point of view. If not, then I guess that Firefox will be continually sending my password in the clear in the background (unbeknownst to me), in which case I need a better long-term solution than this.
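For reference, a hedged sketch of the change (expressed here as Apache mod_rewrite, which is an assumption on my part; the actual rules on my server may differ):

```
# Force HTTPS only for the login page; wp-admin and everything else
# stay on plain HTTP, so the admin pages share the site's base URL
# and the WordPress 2.1 AJAX features keep working.
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^wp-login\.php$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

The previous version redirected both wp-admin and wp-login, which is what put the admin pages on a different origin than the rest of the site.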

In the meantime, I am immensely relieved to have put this headache behind me. Further, it turned out to be an interesting first use of Firebug, which is clearly awesome 🙂

P.S. I am very happy to have been able to link to Firebug as a result of it helping me to track down this problem 😉

Why does most technology feel “random” so often?

I’ve been involved professionally with technology since 1980. So, you’d think that I understand it (and how it works) reasonably well by now. On some levels, sure, but on others, I feel as helpless as the proverbial mother-in-law or grandparent in the “clueless users” examples people always give…

Conceptually, I understand how “small tweaks” can lead to large unexpected results. It’s a variation on chaos theory. Practically, it’s still annoying. What is harder (for non-techies) to understand is when things break down after no changes (that they are aware of!). Of course, it’s the parenthetical comment that is the clue.

With modern operating systems, the vast majority of users have some form of automatic updates turned on. That being the case, things are changing frequently, and possibly in very significant ways. It just so happens that the user doesn’t associate different behavior in their favorite applications with an invisible update.

The above was just generic whining to get to one or two rants that have been bugging the hell out of me lately…

The first topic is spam filtering. For many reasons (most of them rational ;-)), I am a Windows user (specifically, WinXP Pro, but that’s not important). I don’t think it’s superior, etc., but many applications that I find convenient (and in some rare cases even necessary) are always available first on Windows, and often only on Windows… C’est la vie…

So, being a committed Windows person (no, the irony of that statement doesn’t escape me ;-)), for many years, I was a tried and true Outlook user. In fact, I started with Outlook 97, moved to 98, then 2000, and then 2003 (no, I didn’t have the pleasure of Outlook XP).

In the early years, there was no need for spam filtering. Not only was the volume of spam low, but my Internet activities were reasonably limited, so I wasn’t on many spam lists anyway. Of course, being a VC now, and having my name on many public sites, along with being subscribed to many mailing lists (public as well as publicly available internal company lists), has changed that dramatically.

On some days, I get well over 1000 spam messages (through the variety of means that email can wind up in my real account). Clearly, that isn’t a sustainable number of mails to have to delete by hand (even though I am ultra fast at spotting spam and hitting the Junk key).

So, a few years ago, I installed the free SpamBayes plug-in for Outlook. (This now requires a minor side-rant) 🙁

<Side Rant>

Ever since I upgraded to WordPress 2.1, I can’t create any links with their “visual” tab. I wanted to link to the SpamBayes project page above, and got a blank pop-up box where the form is supposed to be. Firebug shows errors with TinyMCE, and before that, an error with an XMLHttpRequest, so it’s likely a Firefox config issue that’s causing the problem, but I have no idea whatsoever what else to try (obviously, I’ve tried a lot of things…)

</Side Rant>

So, I ran SpamBayes for a long while, and also ran a commercial derivative of it, InBoxer (should have had a link to that as well…)

It did a pretty good job. Still, it wasn’t all that satisfying, because every message needed to be downloaded to my laptop, before SpamBayes (SB) could analyze it. That meant that on a heavy spam day, if I was on a slow link (let’s say dial-up, gulp), I had to wait for all of the spam to come down to find the few gems that I was breathlessly waiting to read.

So, after doing that for quite a while, and building up a large SB db, I decided to get creative. I installed SB on the server as well (I control my own server), and regularly uploaded my local (meaning laptop) SB db to the server. Then I added a procmail rule that filtered each message using the locally trained db (but now up on the server), and then did one of three things with the result:

  1. If it was marked as “ham” (definitely not spam), it was just passed through normally.
  2. If it was marked as “unsure” (the range is user-definable), then it was moved to another account on the server, so that it didn’t auto-download on each email check (this solved the problem of slow links with lots of possible spam)
  3. If it was marked as “spam”, it was deleted right then and there on the server.
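For the curious, the server-side stanza might have looked something like the following. This is a sketch, not my actual rules: sb_filter.py is SpamBayes’ real command-line filter and X-Spambayes-Classification is its default header name, but the mailbox path here is purely illustrative.

```procmail
# Run each message through the SpamBayes command-line filter, which
# adds an X-Spambayes-Classification header based on the trained db.
:0fw
| sb_filter.py

# Definite spam: delete right then and there on the server.
:0
* ^X-Spambayes-Classification: spam
/dev/null

# Unsure: divert to a separate mailbox so it doesn't auto-download
# on every mail check (path is illustrative).
:0:
* ^X-Spambayes-Classification: unsure
$HOME/Mail/unsure

# Anything left is ham and falls through to normal delivery.
```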

This worked very nicely for quite a while as well.

Then, I woke up one day, and decided to break myself of the Outlook addiction. I’m still firmly in the Windows world, and have been, even though I stopped using Outlook for email over two years ago now! Even though I own a legal copy of Office 2003, I now only use Outlook for Calendar, Tasks and Notes, and that only because it syncs reliably with my Treo 700p.

I switched to Thunderbird, and have never regretted doing that. I’ll save any niggling complaints about Thunderbird for some future post when I am really bored, since for the most part, I am extremely happy with TB.

Now, the first part of the problem. TB has built-in Junk filters, which work OK (but not that great), but that puts me back to having to download everything to have it analyzed. The second part of the problem is that I can continue to use the old (static) SB db on the server to help cut down on spam, but the real beauty of SB is the B (Bayes), which continually learns. Since spammers constantly change their strategies to stay ahead of the anti-spam companies, having an outdated SB db degrades its usefulness over time.
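To make the “B” concrete, here is a toy Python sketch of Graham-style Bayesian token scoring. This is not SpamBayes’ actual algorithm (SpamBayes combines evidence with a chi-squared scheme), and all names and numbers are mine. The key property it illustrates: a token the training database has never seen scores a neutral 0.5, which is exactly why a frozen database degrades as spammers rotate their vocabulary.

```python
from math import prod

def token_spam_prob(token, spam_counts, ham_counts, n_spam, n_ham):
    """Per-token spam probability from training counts, clamped so a
    single token can never be absolute proof either way."""
    s = spam_counts.get(token, 0) / max(n_spam, 1)  # freq in spam corpus
    h = ham_counts.get(token, 0) / max(n_ham, 1)    # freq in ham corpus
    if s + h == 0:
        return 0.5  # never-seen token: no evidence either way
    return max(0.01, min(0.99, s / (s + h)))

def message_score(tokens, spam_counts, ham_counts, n_spam, n_ham):
    """Combine per-token probabilities into one spam score in [0, 1]."""
    probs = [token_spam_prob(t, spam_counts, ham_counts, n_spam, n_ham)
             for t in tokens]
    p_spam = prod(probs)
    p_ham = prod(1 - p for p in probs)
    return p_spam / (p_spam + p_ham)
```

With a live filter, every “this is spam/ham” correction updates the counts; with a static db, yesterday’s counts judge tomorrow’s mail.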

Wow, I can’t believe how much background I just gave in order to get to the actual point…

Recently, emails that were previously being marked as “ham”, or “unsure”, were getting tagged as guaranteed “spam”, meaning SB was assigning them a spam score of 1.0! Of course, my server-side filter was dutifully tossing them to /dev/null as instructed, and I was blissfully unaware of that.

I discovered that when another phenomenon began: any emails with large attachments were going directly to /dev/null. Since most of my procmail rules are also duplicated for Lois, she was complaining, before I even noticed, that people were writing tons of “follow-up” emails to her, wondering why she hadn’t responded to their last email. Those follow-up emails were getting through because they didn’t have attachments. I am still not sure that this was caused by the old SB db, but at least it led me to find the other emails that were definitely being miscategorized…

In any event, I turned off the SB db, and the flood of spam started up again. About a month ago, I turned off SpamAssassin on the server side, because while it was somewhat effective, it was also one of the biggest resource hogs I had ever seen on the server, and the “reward” wasn’t worth it…

So, now, I’m spending a little too much time hand-tuning procmail rules to get the spam back down to a manageable range. So far, so good, but with lots more effort than I would have hoped to expend, given the nice steady state I had for a reasonably long time.

Anyway, this post has turned out way longer than I expected, so I will save the other “random” events for some future post, when they bubble to the top of my frustration queue.

P.S. I am still not sure I’ve “solved” the large attachment problem. My temporary solution was to specifically whitelist those senders in procmail, which works, but raises the question of whether others are being thrown away that I’ll never find out about, or find out about too late 🙁
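For completeness, a whitelist of that sort is just an early procmail recipe that delivers matching senders before any spam logic gets a chance to run. The addresses below are placeholders, not the real senders:

```procmail
# Whitelist known correspondents ahead of all spam filtering;
# addresses are placeholders.
:0:
* ^From:.*(alice@example\.com|bob@example\.org)
$DEFAULT
```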

I’m truly a glutton for punishment…

OK, so here I am again, updating my progress on the old Dell Latitude L400 saga…

Before I begin, let me assure the millions of readers of this blog that I have no illusion that I will ever be able to use this laptop as anything other than a “guest surfing” node in my apartment. It’s not the case that I think if I just “bull through this”, I will figure out how to make it reliable for its originally intended purposes…

So, why do I continue banging my head against this rock-solid wall? Because I like to understand things, even things I can’t make work. Since some of the failure modes of the machine are reproducible, I have an inclination (or perhaps fantasy is a better term) that I can at least figure out what is failing. Even if I do, there’s no way that it will be economically viable to “fix it”, but knowing will make all the difference (to me at least) 😉

Deciding that it was worth the minor effort to turn this into the equivalent of a “thin client” browsing machine, I wanted to pick a Linux distro that would require little tweaking for my purposes. While Ubuntu 6.06 was working reasonably well on the machine previously (fewer halts than Windows), I wasn’t crazy about putting it back on. This actually introduces a little side rant…

(Pardon the interruption of the real point of this thread…)

I get that Linux distros want to offer some sort of “stability” promise, and as such, are supposedly careful about upgrading apps too aggressively. There’s a general “goodness” associated with that concept. That said, it’s also annoying that there isn’t an easy way to override that.

An obvious example is Firefox. Ubuntu 6.06 ships with Firefox 1.5.0.x. Assuming that it is correct not to upgrade to 2.0.x, is it therefore correct not to automatically update (through their automated updating service) to the latest 1.5.0.x point release? After all, the concept here is that those are bug-fix releases only (no new features).

It’s bad enough that I can’t just get that update automatically, but I’m not even given a choice to upgrade to it optionally, even with a warning that it hasn’t been fully tested. However, what really bugs me (again, remembering that I understand the concept of stability) is that within Firefox itself, logged in as root, I can’t hit the “Check for Updates” button, as it is greyed out. Obviously, I’m being saved from myself, and I don’t like it.

(OK, back to our regularly scheduled saga…)

So, I wanted a distro that had the latest versions of Firefox and Thunderbird. Even Ubuntu 6.10 doesn’t qualify. The current “snapshot” does, but then I don’t get the easy-to-install-from CD, etc.

A little searching, and I found PCLinuxOS version 2007.1 test release. With a little trouble (all caused by me, not PCLinuxOS!), I got it installed. I’m done then right? Wrong! Why? Because, PCLinuxOS doesn’t start up the wireless networking correctly (it works fine with the wired port). Somewhere, deep in the bowels of the kernel, it recognizes my Lucent Wavelan (Orinoco) card, as “dmesg” shows that there is a card inserted in slot 0. All attempts to load the right things (including: “modprobe hermes”, “modprobe wavelan_cs”, “modprobe orinoco_cs”) fail to make the card work, but all succeed in loading the appropriate modules.

Anyway, I boot with Damn Small Linux 3.2 (DSL), and from the Live CD, it recognizes the card correctly, and can access the Net just fine. Damn (so to speak) 🙁

This is likely a general problem with this specific distro (PCLinuxOS), as they have deferred their final release due to the number of problems found in the test release…

So, I’ll let this one go for now (though my general complaint about distros upgrading to current app releases within a bug-fix range stands!) 🙂

So, back to the original intent of this post, which was to explain why I even bother to continue working on this machine, when I know it can’t fulfill my true need for it…

It turns out, that when I put PCLinuxOS on the machine, and found out that it didn’t work with my wireless card, I decided to “upgrade” from their repository, hoping to install additional wireless tools. That worked fine, with one notable exception.

No matter how little or how much I asked for the updater to download (the largest was 80 files with 38MB, the smallest was 12 files with 2+ MB), it downloaded them all just fine, but the instant it started to process the files, the machine halted! Sound familiar?

It then occurred to me that every time the machine halted, it was doing something related to the network. This was across operating systems (Windows XP, Windows 2000, Ubuntu 6.06, PCLinuxOS, etc.). So, perhaps a bad NIC? I highly doubt it. Why? Because in the case of PCLinuxOS, it was the wired port; in the case of Windows 2000, it was operating 100% wirelessly, etc. Also, it was able to do hundreds of MBs and hundreds of individual updates for Windows XP and 2000. It was specific networking use cases that caused it to crap out!

In the PCLinuxOS case, after rebooting and starting Synaptic again, and asking it to update again, it found the packages locally (intact!), and installed them all without problem. On the next set of updates, it failed in exactly the same way. Download them all cleanly, and upon attempting to install them, the machine halted.

So, I’m no closer to understanding what is happening, but it seems to be related to some kind of networking issue, perhaps related to task-switching from networking to disk, etc. Who knows…

I’m hopeful that this is the last post on this specific laptop (as I’m sure the throngs of my readers will appreciate!) 😉

Fifth time’s the charm :-)

I googled to see whether other people were having trouble with the upgrade to WordPress 2.1. Most were successful on their first try, but a few people had problems that were similar to mine.

I made two changes and tried again:

  1. I ensured that my user had full permissions on the DB
  2. I performed the upgrade from a fresh directory untar of 2.1 (previously, I had untarred on top of 2.0.7)

I doubt that #2 had any effect, but I may as well be complete in my report.

This time, everything went fine. This post is being brought to you courtesy of WordPress 2.1. Whew 😉

Just joined Technorati

I thought I might keep this blog a secret, since I have nothing useful to say (seriously!). But, on the off chance that I can ruin someone else’s day by having them read this, that might make me feel a tad better about the things that I am ranting about to begin with 😉

Technorati Profile

This is the post where I’m claiming my blog with them…