Manatee of Cotuit

On Sunday no less. Off of Ropes Beach and the yacht club while I puttered on the yacht. This sea cow is a long way from home and the animal rescue squads are searching for it before the water turns chilly.

Native v. Web Apps

Dries Buytaert has posted a disquisition on the friction of native apps and the promise of frameworks such as Ember to further extend the limits of HTML and the website model to better ape the power of native mobile apps, like those built for the Android and iPhone platforms. Native apps fracture the user experience and are pernicious enough in some circumstances to inspire the wonderfully grumpy Tumblr: I Don’t Want Your Fucking App. Dries sees great potential for web apps to match the better functionality and hardware integration of a native app — think of going to pay for your next skinny crappachino versus loading their app and waving a bar code in front of the scanner — as the capability to use JavaScript front ends puts the client, the browser, in more control of the functions of a website. This could, if adopted, be as transformative to the future of the web as the advances from over ten years ago, when LAMP (the four open source horsemen of Linux, Apache, MySQL and PHP) and Rails turned Gmail into an Outlook killer.

I’m all for it. The revolution of responsive design did away with the two-front battle of maintaining desktop and mobile versions of websites — no more “” versions are needed if one follows the precepts of responsive design, which lets a site conform to the form factor of whatever device is viewing it. Why do I care?
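The one-codebase idea behind responsive design can be sketched in a few lines. This is a toy of my own, not anyone's production code; the function name and the breakpoints (600px, 1024px) are arbitrary illustrative choices:

```javascript
// Toy sketch of the responsive idea: one codebase, with the layout chosen
// by viewport width instead of maintaining a separate mobile site.
// The breakpoints used here are arbitrary illustrative values.
function layoutFor(widthPx) {
  if (widthPx < 600) return 'single-column'; // phones
  if (widthPx < 1024) return 'two-column';   // tablets
  return 'full-width';                       // desktops
}

// In a real page the same decisions usually live in CSS media queries,
// or in listeners registered via window.matchMedia('(max-width: 600px)').
console.log(layoutFor(375), layoutFor(1440)); // single-column full-width
```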

Somewhere there exists a list of life’s most stressful events — new job, moving, divorce, death of a loved one — and I propose the addition of “moving to a new cell phone” as one of the more soul-crushing, tantrum-inducing experiences, hashtag #firstworldproblems. My faithful HTC M9 was flaking out due to a weary USB port — charging or not charging depending on the barometer and how badly I depended on it not shutting down when I needed it the most. So, in the spirit of my friend who, bullshit over a lemon of an outboard motor, gave up on the warranty and the dealer’s efforts to fix it and just went out and bought a new one, prompting his son to say, “You showed them, Dad!” — I bought a new, unlocked phone from HTC and just went through the tedium of moving my life from one over-priced rectangle to another.

Sure, there are transfer tools — and yes, eventually all my old apps and content moved from the retired phone to the new one, but… and this is a big but that has nothing to do with apps versus web… it came with the blinding realization that while the apps and content transfer over, the login credentials do not. So for hours I have been resetting passwords, transferring payment details, and cursing the total idiocy of passwords, CAPTCHAs, and two-factor authentication.

Anyway, Dries’ post is worth a read. From his post:

“Native applications versus web applications

“Using a native application — for the first time — is usually a high-friction, low-performance experience because you need to download, install, and open the application (Android’s streamed apps notwithstanding). Once installed, native applications offer unique access to smartphone capabilities such as hardware APIs (e.g. microphone, GPS, fingerprint sensors, camera), events such as push notifications, and gestures such as swipes and pinch-and-zoom. Unfortunately, most of these don’t have corresponding APIs for web applications.

“A web application, on the other hand, is a low-friction experience upon opening it for the first time. While native applications can require a large amount of time to download initially, web applications usually don’t have to be installed and launched. Nevertheless, web applications do incur the constraint of low performance when there is significant code weight or dozens of assets that have to be downloaded from the server. As such, one of the unique challenges facing web applications today is how to emulate a native user experience without the drawbacks that come with a closed, opaque, and proprietary ecosystem.”
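The API gap Dries describes can be sketched with a bit of client-side feature detection. This is an illustrative example of mine, not code from his post; `describeCapabilities` is a hypothetical helper name, and it takes a `navigator`-like object as a parameter so the sketch can run outside a browser:

```javascript
// Hedged sketch of progressive enhancement: feature-detect native-style
// capabilities before using them, and degrade gracefully when absent.
function describeCapabilities(nav) {
  // `nav` is expected to look like the browser's global `navigator` object.
  return {
    // Geolocation is one of the few hardware APIs the web already exposes.
    geolocation: typeof nav.geolocation === 'object' && nav.geolocation !== null,
    // Service workers enable offline caching and push-style behavior.
    serviceWorker: 'serviceWorker' in nav,
  };
}

// In a real page you would call describeCapabilities(navigator);
// here we exercise it with mock objects so it runs anywhere.
const appLike = describeCapabilities({ geolocation: {}, serviceWorker: {} });
const bareBones = describeCapabilities({});
console.log(appLike.geolocation, bareBones.geolocation); // true false
```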

Something happened: the Perils of Self-Hosted Blogs

I was at a funeral on Labor Day here in the village and a few loyal readers of this blog asked me what knocked me offline for so many months.  I think the suspicion was I was suffering from writer’s block, but  the true explanation was a heavy hacking of my  server by some spammers who injected the domain with about 30,000 spam sites linking back to purveyors of porn, affiliate programs, diet plans, and content farms.

My old Internet Service Provider (who I won’t blame because it’s not their job to provide me with a hardened, secure site) had to disable the entire domain because I was on a shared server with other customers  and they were seeing their sites slow down as the evil spam douche bags filled up all available space on with their crap sites. I’d call the ISP, get tech support on the phone, ask them to turn it back on long enough for me to save 15 years worth of writing and migrate the entire database to

Even as I cleared out the bad sites, patched the code, applied security measures, and did my best to defend the old blog, I could see the jerks injecting site after site even as I was logged in. Passwords were changed, everything short of hiring an expert was considered, but in the end I had to say goodbye to the platform that kept me happy for the past 18 years.

I self-hosted way back in the 1990s because I wanted to be more hands-on with web content management and server operations when I was running and Reel-Time, my old saltwater flyfishing site. Knowing the rudiments of HTML and web management was an important skill for my career back then, and the experience helped me satisfy the nerd manqué in me. Self-hosting was never easy, especially in the early days of WordPress, when automatic updates of the codebase weren’t possible and I had to download patches and new versions and apply them to the blog myself. I initially was on Blogger — the blog platform acquired by Google. But Om Malik persuaded me to jump onto WordPress in 2001 and I was a fan from the very start. I got nailed by an xmlrpc hack in 2005 and lost the site for a while to some hacker, and many a time I shot myself in the foot with some rogue plug-in that required my friend Mark Cahill to swoop in and save the day.

The lesson I learned from this most recent series of hacks and frustrations is that security is a very real issue for any site owner, so much so that I can’t believe a layman such as myself can survive for very long without a managed hosting provider to supply a layer of security and oversight that a casual blogger just can’t bring to bear. The scuzzier elements of the Internet — the spammers and link farmers and affiliate marketing scum who prey on other sites to build link juice for their own money-making schemes, the ransom artists, the script kiddies who prowl around looking for old unpatched sites and then infect them like some toenail fungus… eventually they’re the ones that are going to crush the notion of the Open Web, as independent creators like myself get fed up with swatting down their efforts to hijack our content and traffic so they can make a few pennies off their latest get-rich-quick scheme.

The real shift is also in ISPs. The days of dumb rack hosting, where you get nothing more than “ping, power, and a plug,” are done. Where I work, Acquia, the value to the customer comes from running their sites on a hardened platform that is monitored, managed, and patched by experts who can diagnose problems and fix them. When I lost in the fall of 1999, the hosting provider was useless when it came to diagnosing the problems that were causing the site to flatline under an extraordinary spike in traffic. All their Network Operations Center personnel could do was confirm the server was powered on and connected to the Internet. It took four days of a dead site and a lot of anxiety before someone was able to identify that the problem came from too much stress on our ad servers.

When a seriously critical site — like a newspaper during a big news event — goes dark, it’s not just the site owner who suffers from the outage; the audience who need the site to be available suffer too. Failure on a website is not just an inconvenience to a hobbyist blogger like myself; for big e-commerce operations, government agencies, and news outlets, an outage can be disastrous.

But what about the casual user? Does the need for a simple platform even matter anymore when most people are content with a Facebook page, Instagram account, or a blog? I don’t need (nor care) to deal with SSL certificates or make sure the version of PHP I’m running is up to date. It’s simply too far down in the fabled stack for a casual user to worry about. But if not knowing those things means some Ukrainian hacker can shut me down, then I’m either going to throw in the towel and join the loathed world of Facebook, or find a middle-ground solution. Hence I’m back in the saddle and blogging, and not dicking around with FTP clients and cPanel anymore.

The solution was to leave my old service provider, move the domain name to Google so I could keep my email address, and then map the blog to — the service provided by WordPress’ corporate parent Automattic. Now I have two-factor authentication, protection from a security service called “VaultPress,” and a managed provider which guarantees the latest versions are always in place and any security patches applied without me needing to take action.

Why am I not blogging with Drupal on the Acquia platform? That’s next. One step at a time. When one has 6,000+ blog posts extending back to 2001, the first priority is to save that body of work, and only then consider something as dramatic as a new blog system. Stay tuned; this transition alone took a couple of weeks off to get accomplished. A Drupal build will probably have to wait until the Christmas holidays.

I have worked with Drupal before, beginning back in 2005 when I was at IDG and needed to build a site for an advertiser at That was Drupal 5 — now Drupal is on a fresh new version, Drupal 8 — and I want to learn the latest.


On Straight Talk

Written on vacation, while deep in the final volume of Manchester’s biography of Churchill, a master of the English language who despised doublespeak and verbosity:

During my five years with Lenovo, the personal computer company born out of the acquisition of IBM’s PC division by the Chinese computer company, Legend, the experienced IBM executives who came over with the acquisition used to engage in something they called “straight talk.” I found this term a bit off-putting because it made me question what I was hearing the rest of the time, but it mainly referred to a mano a mano conversation in which one person told another person some blunt truth in unvarnished terms with some scatological obscenities mixed in to underscore the point. The official term for straight talk is, I suppose,  “Plain English” and while I’m sure anyone would agree simple-and-direct beats jargon and clichés, it amazes me how quickly we all lapse into wordiness and meaningless pomposity.

In corporate communications, indeed in any organizational vocabulary from governments to religions, the insidious, creeping effects of bureaucratic doublespeak inevitably begin to infest the words and messages of the institution. George Orwell wrote the definitive essay on this phenomenon in his 1946 piece, Politics and the English Language. Dickens satirized it in Little Dorrit with the invention of the “Circumlocution Office.” For centuries, indeed as far back as Geoffrey Chaucer in the 14th century, guardians of the language have railed against its pollution by double-speaking, pedantic bloviators who refuse to follow the canon of simple, clear communication.

As a corporate communicator, a so-called “content marketer,” I have a conflicted view of the disease as both a carrier and critic. Without casting stones inside my own house, let me just say that I fight a constant, daily war against the forces of derivative babble-speak, and think, after over a decade within the walls of corporate communications (after two spent in journalism), that I understand the source of the pestilence.

Its name is Google Search.

In technology marketing, language is defined by a three-way symbiotic relationship among the technology press (who are on the wane and not nearly as influential as they were in the 1980s), technology analysts (who are like the press in many regards, but carry the responsibility for creating the taxonomy of categories that define markets, such as “marketing automation” or “Platform as a Service” or “Web Content Management”), and the vendors themselves. Because technology and governments need acronyms to survive, these analyst categories lead to “WCM” and “PaaS” – and if the press adopt them, which they often do unwittingly, then two legs of the three-way relationship are set and it is only left to the corporate side to adopt the terms and try to define themselves within them.

The analysts rank the companies within a market category, issuing reports (sold at high cost to corporate subscribers) which are used by customers to select the technology that best suits their needs.

So where does Google come in? Simple. The best book on the topic remains John Battelle’s definitive 2005 work, “The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture.” It’s essential reading for anyone in digital marketing, and it does the best job of explaining the impact Google had on the language by using the citation system of academic journals to make a value judgment about which links returned on a search would be ranked first. This gave us the sordid world of search engine optimization and search engine marketing, and before we knew it words and links had been, to steal a phrase from Doc Searls and David Weinberger, “weaponized.”

Now, in corporate communications, when you put pen to paper so to speak, you need to wonder “how will this rank in search?” If a competitor seems to be doing well with “The Internet of Things” or “Big Data” and making claims to analysts and the press as well as on its own website and ebooks that its products are the best for “IoT” or “Big Data,” well then by golly why not us?

Technology doublespeak has no pride, and it moves more quickly than teenagers inventing new slang like “420” or “Netflix and chill.” If some “thought leader” says a startup needs to “pivot” to meet new opportunities and become “agile,” then suddenly LinkedIn and Twitter are awash in other wannabe thought leaders jumping on the “agile pivot” bus. The end result is that perfectly good words – words Orwell or Strunk & White or Ernest Gowers would approve of – suddenly start to get worn out like old coins that have had their embossing erased by so many fingers over time. We know that old dime means “ten cents,” but poor old FDR is just a ghost and the date is barely legible anymore as we hand it over for a stick of gum.

The old Dudley Moore movie Crazy People is about an advertising executive who suffers a nervous breakdown and winds up in an asylum. Unable to let go of his workaholic ways, Moore enlists his fellow patients to help him develop campaign concepts, taking “straight talk” to a hysterical extreme and yielding the memorable copy line: “Metamucil: It Helps You Go to the Toilet. If you don’t use it, you’ll get cancer and die.”

I’m not railing against corporate communications and public relations per se. No one wakes up in the morning thinking, “I think I need the Freedom to Innovate,” and yet we can easily toss that phrase into some boilerplate and move on, having checked off some of the magic buzzword-bingo words we want Google to rank us on. Nor am I proposing we all adopt some Hemingway-esque model of short declarative sentences built of short declarative words. But I do believe that if a message is ever to truly stand out, it needs to leave the pack and be scrutinized for any bombastic, verbose, sesquipedalian tendencies to use the incomprehensible to blow a smelly fart of vaporware, and its friend fear, uncertainty, and doubt, over the poor reader.

The challenge isn’t knowing how to write, it’s persuading other people that good writing is better than bullshit. All I can say is good luck. It’s lonely inside any organization in love with the smell of its own verbal farts, and trying to open a window to clear the air is a constant battle, always rekindled as new carriers of the disease float in from some other plague town, spouting theories of “lovable marketing content” that is “engaging, authentic” and spreads like dandelions over the fields of social media. Just stick to your guns, remember the admonition printed on every reporter’s notebook (“Accuracy, Brevity, Clarity”), and take heart that great men like Churchill railed against this insidious, pernicious infection as loudly as the greatest writers ever known. Heck, Churchill won a Nobel Prize for Literature, so I take that back: the man saved England with the language of Shakespeare, Gibbon and Kipling.

Let me conclude with George Orwell:

“A man may take to drink because he feels himself to be a failure, and then fail all the more completely because he drinks. It is rather the same thing that is happening to the English language. It becomes ugly and inaccurate because our thoughts are foolish, but the slovenliness of our language makes it easier for us to have foolish thoughts. The point is that the process is reversible. Modern English, especially written English, is full of bad habits which spread by imitation and which can be avoided if one is willing to take the necessary trouble. If one gets rid of these habits one can think more clearly, and to think clearly is a necessary first step toward political regeneration: so that the fight against bad English is not frivolous and is not the exclusive concern of professional writers.”

And for you lost souls toiling in the coalmines of jargon, a brief reading list:
