Ubuntu SSH Client: Too Many Authentication Attempts

I’ve been using a “config” file located at ~/.ssh/config to list the identities for the various SSH hosts I connect to on a regular basis. It saved me from having to keep track of the usernames for my various accounts on those servers, but once the file reached a certain number of entries, I started getting this error:

Received disconnect from *HOST*: 2: Too many authentication failures for *USERNAME*

I Googled around and tried various solutions, including ssh-add, with limited success. But running an ssh -v hostentry command for a given connection (the -v flag enables verbose mode) showed me that my machine was still offering up multiple keys.

This seems counter-intuitive to me. The whole point of using the config file is to tell it which key to use, right? Why should I even need to add the identity to the SSH agent? And I wasn’t about to increase the number of retries on the servers. That seems like a recipe for disaster. I should only need one try because I have the right key sitting here!

I finally ran the right Google search and discovered this Super User (Stack Exchange) question, which had the missing component I needed in one of its answers.

The critical element in the config file that forces the SSH client to use only the key specified is this line:

    IdentitiesOnly yes

Adding that to each of the entries in the config file (immediately below the “IdentityFile” declaration) did the trick.

So now a typical entry in my config file looks something like this:

Host myshortcut
  HostName somedomain.com
  User someuser
  IdentityFile ~/.ssh/somekey_rsa
  IdentitiesOnly yes
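
One more note from the ssh_config man page (worth verifying on your system): since the first obtained value for each option wins, you can put general defaults in a wildcard block at the end of the file instead of repeating IdentitiesOnly in every entry. A sketch:

    Host myshortcut
      HostName somedomain.com
      User someuser
      IdentityFile ~/.ssh/somekey_rsa

    # General defaults go last; per-host blocks above take precedence.
    Host *
      IdentitiesOnly yes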

I hope this helps someone!

Comcast Injecting Code into TLS-Encrypted Pages

A few days ago, I was sharing my screen from my laptop with a colleague, when up popped a notification informing me that Comcast (through its Xfinity brand) was no longer supporting my equipment and instructing me that it had to be replaced.

Now before you jump to any conclusions, let me clarify the exact conditions under which this notification appeared:

  • The notification appeared in a single browser tab
  • The browser was the latest version of Firefox, running on Ubuntu 18.04.1
  • The page was from a server under my control, which my colleague and I were working on at the time. It was completely free of any foreign JavaScript that would have allowed or facilitated a Comcast code injection
  • The connection was encrypted via TLS 1.2 with a secure cipher
  • The TLS certificate on the server side was a current one issued by Let’s Encrypt
  • Firefox gave the connection a “green padlock” with no mixed content warnings, as in this screenshot (except that I didn’t examine the cert in my browser at the time like I did when capturing this graphic):
This screenshot was captured at a different time, but is representative of what I saw on the page that Comcast injected its code into recently. The difference is that in this screenshot, I was connected to the same site through a VPN.

Authentic Comcast Notification

I didn’t have the presence of mind to screenshot everything at the time the notification appeared. I wish I had. The fact is that I was deep into a lengthy session with my co-worker, and we were focused on the project at hand.

Also: I was flabbergasted.

Comcast is notorious for injecting code into subscribers’ browsers, most notably to track browsing behavior, display its own ads, and warn users about data caps. But that sort of activity is precisely why the EFF and others have been promoting HTTPS everywhere. TLS encryption should prevent what is essentially a “man-in-the-middle attack” by ISPs like Comcast against their own customers.

None of this was news to me.

And while I didn’t screenshot the “end of life notification” regarding the Comcast-issued on-premise equipment, I did click through from the notification.

In a new tab, I was shown the exact model numbers of the equipment for which Comcast was ending support (i.e. only the model numbers in use at my location, and in the correct quantity), and was taken through a simple “wizard” which offered to send me the replacement equipment in a self-install kit. Although I clicked through to see exactly what it would do, I did not submit the form, instead choosing to click a link which opened this support article about “end of life” equipment in a new browser tab.

Had I not been under time pressure related to the project we were working on (and had I not also been bearing the cost of the person’s time who was on the other end of the active screen-sharing session), I might have taken the time to investigate further. Regrettably, I did not do this. It surely would have netted me the specific lines of code injected into my browser on top of the screenshots which I should have captured, but didn’t.

The bottom line, however, is that the message was authentic. The equipment in question was correctly beyond its end of life, and the notification was being served from Comcast.

Also: my machine is clean. I run very few browser extensions. None of them are known to have any vulnerabilities. And my machine is—by every indication, including clean scans—free of malware.

Exit Xfinity

Rather than requesting a self-install kit with replacement equipment, I packed up the end-of-life gear and took it to a local Xfinity store, cancelling my service.

I’ve had television service with Comcast/Xfinity for the better part of the last decade, and internet service for many years longer.

But I would have cancelled the account much sooner were it not for the utter lack of competition in my area. Broadband service can essentially only be had from 2 vendors at the location in question. They both gouge customers with excessively high fees, exploiting the unfair economics of the situation.

But this incident was the last straw.

Comcast had violated me one too many times. Hopefully they won’t turn out to have been the lesser of two evils. Time will tell. In the meantime, something must be done to level the playing field so that ISPs who don’t egregiously abuse the privacy of their subscribers can compete effectively.

Inline Code in DokuWiki

Call me picky, but I didn’t want to be forced to use a code block every time I wanted to include a little bit of code. Sometimes, it’s just handier for the code to be inline so it doesn’t disrupt your text, but still clearly looks like code.

So today, after one too many failed attempts to make it work, I dug around and found this answer on StackOverflow. It took me a moment to realize that it was from Andreas Gohr, the creator of DokuWiki.

In short, just enclose your text in ''%% and %%'' like this:

Here's a sentence I'm typing and ''%%this is code%%'' I want to include inline.

This works beautifully!
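
For what it’s worth, the two wrappers do different jobs (this is my reading of DokuWiki’s behavior, so verify on your own wiki): the '' pair renders its contents in monospace, while the inner %% pair tells DokuWiki not to interpret any markup inside. That means formatting characters in your code stay literal:

Compare ''%%**stars stay literal**%%'' with ''**stars turn bold**'' in a monospace span.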

Ironically, it’s not shown in the official syntax documentation, except that if you view the source for that documentation, you realize this technique is actually used to put inline code throughout the syntax documentation.  😉

(Now why didn’t I notice that and think to look at the source code? Sheesh.)

How to Recover a Lost Draft in DokuWiki

TL;DR grep your filesystem for a unique fragment of text that’s likely to only appear in the content you lost when your draft disappeared. Step-by-step instructions here.

Not long ago, we started using DokuWiki as an internal solution for documenting technical details, systems, and best practices in our digital marketing agency. Let me just say that I love the software. It’s easy to install and configure, training users on it is relatively painless, and its simplicity makes it an amazing solution for purposes like ours.

But… like any new system, getting accustomed to its quirks can take some time—especially quirks you don’t run into very often.

Today, I was working on a lengthy new page in DokuWiki and I got busy researching something in another browser tab (or 10). Naturally, I hadn’t hit the “Preview” button, nor had I saved a version.

You can probably guess where this is headed.

I returned to the browser tab where I had DokuWiki open and found the dreaded “editing lock expired” message.

Normally, this wouldn’t be a big deal. We aren’t typically handling lots of concurrent users, so usually only one of us is editing at a time, much less editing the same page. And I’ve found that just by clicking one of the options, I can usually get right back to the editor.

But this was a brand new page that hadn’t been saved yet.

And, being in a hurry, I just started clicking buttons and not paying attention to what I was doing. The next thing I knew, I was looking at an empty editing window.

And this was after spending more than an hour working on the content for the page. It was gone. All of it.

The one thing I had going for me is that I had noticed a “draft autosave” message in the browser at one point. So, I went looking to see if I could find the draft.

Where DokuWiki Stores Drafts

If there had been a saved draft, DokuWiki would have shown it to me when I visited the “edit” screen for that page again. But I didn’t get a message about an existing draft. Also, the “Old Revisions” tab for the page was empty. This made me suspect that my draft had been lost.

So… I connected to the server (via SSH) where the instance of DokuWiki was running and started looking around.

After some Googling, I found that by default, DokuWiki drafts are automatically saved in the /data/cache folder, sorted into numbered subfolders.

Issuing the ls -lt command, I could see which subfolders were the most recent ones, and I looked through them. There were no files with a .draft extension, which explained why DokuWiki hadn’t shown me a draft for my page when I re-opened the editor.
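
As a concrete sketch of that check (the cache layout below is a stand-in I created for illustration; in a real install the tree lives under your DokuWiki root, e.g. dokuwiki/data/cache):

```shell
# Stand-in for DokuWiki's data/cache tree of numbered subfolders.
mkdir -p cache/0 cache/7
touch cache/7/leftover.draft

# List subfolders newest-first, so recent activity is easy to spot
ls -lt cache

# Are any autosaved drafts still on disk?
find cache -name '*.draft'
```

In my case, the equivalent of that find came up empty, which matched what the DokuWiki editor was telling me.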

But since I knew I had seen the “draft autosave” message previously, I knew there had been a .draft file at one point. Given that the file no longer existed, surely it had been deleted!

Well that’s great… we can undelete files, right?

Not so fast. This particular server is a VPS instance at Digital Ocean that we use for intranet purposes. Because it’s a VPS, the typical data recovery tools for Linux like TestDisk and foremost aren’t much help. Virtualized disks mean virtualized storage… or something. I’m out of my depth here.

Let’s just say that I tried both of them and didn’t get the result I was hoping for.

Recovering Text Files in Linux

Since DokuWiki stores content in text files on the server, it occurred to me that I should look specifically for a means of recovering .txt files (not even one of the available options in foremost, which has command line options for various file types).

I found a tidbit on recovering deleted plain text files in Linux that gave me some hope. And after just a couple of minutes, I had found the entire contents of the last “draft” of my DokuWiki page. Here’s exactly how I did it.

Steps to Recover a Deleted DokuWiki Draft in Linux

  1. Browse the filesystem on the server where your DokuWiki installation is located. In my case, I used ssh to connect to our intranet server in a terminal window.
  2. Determine where the partition containing your filesystem is “mounted” in Linux. From my terminal window, I ran the mount command (on the server, of course) to display a list of mounted filesystems (details on the mount command here). Just running the command by itself with no command line options will display the full list. It’s a lengthy, hairy mess.

    On a normal Linux workstation (non-virtualized), you’d typically be looking for something like /dev/sda1 or /dev/sdb2. On the Digital Ocean VPS, I spotted a line that began with /dev/vda1 on / type ext4. I decided to give that a try.
  3. Next, you’ll need to recall a bit of text from the page you were writing when your draft got lost. The more unique, the better. Also, the longer, the better.

    The command we’re going to run is going to look for bits of text and then kick out the results from its search into a file you can look through. If you use a short or common string of text in the search, then you’ll get a huge file full of useless results (kinda like running a Google search for a common word like “the” would produce).

    In my case, I’d been working on some technical documentation that had a very specific file path in it. So I used that as my search string.
  4. Run the command below, substituting your unique phrase for ‘Unique string in text file’ (be sure to wrap your text in single quotes, though) and your filesystem location for /dev/vda1
    grep -a -C 200 -F 'Unique string in text file' /dev/vda1 > OutputFile
  5. Wait a few minutes. In my case, the grep command exhausted the available memory before too long and exited.
  6. Look through the file that got created. You could use a command like cat OutputFile or, as long as the file isn’t too huge, you could even open the file in an editor like nano by using nano OutputFile. The advantage to the latter method is that you can then use CTRL+W to search through the file.

    On my first attempt, I used a shorter, more common phrase and got an enormous file that was utterly useless. When I gave it some thought and remembered a longer, more unique phrase, the resulting file from the second attempt was much smaller and easier to work with. I found several revisions of my draft, and that gave me options to work with. I decided which was the most complete (recent) and went with it.
  7. Copy the text. You can then paste it somewhere to hold onto it, or just put it right back in DokuWiki. Just be sure you hit “Preview” or “Save” your page this time around.

One quick note: I’m not sure if it was necessary or not, but I actually ran the commands above as “root” by running sudo -i first. I haven’t tested it, but this may actually be a requirement. You might also just be able to preface the commands with a sudo (e.g. sudo grep -a -C 200 -F 'Unique string in text file' /dev/vda1 > OutputFile ). For either of these to work, you’ll obviously need to have an account that has the ability to run sudo.
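
For reference, the search itself can be sketched like this. In the demo below, a small stand-in file takes the place of the block device so the flags are easy to try; on a real recovery you’d target the device reported by mount (e.g. /dev/vda1) and run as root:

```shell
# Stand-in for the raw device: on a real recovery you'd use the block
# device reported by `mount` (e.g. /dev/vda1) and would need root.
printf 'noise\ncontext before\nUnique string in text file\ncontext after\nnoise\n' > disk.img

# -a: treat binary data as if it were text
# -C 200: include 200 lines of context on each side of every match
# -F: match the string literally, not as a regular expression
grep -a -C 200 -F 'Unique string in text file' disk.img > OutputFile

cat OutputFile   # or open it in nano and search with CTRL+W
```

The context lines are the whole point: the match itself is just your marker, and the surrounding 200 lines on each side are where the rest of your lost draft turns up.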

I hope you find this useful! If so, I’d love to hear about it. Also: if you have questions or problems, you’re welcome to leave those in the comments as well. If I can help, I will gladly do so!

A Prayer

When I was in high school, a youth choir I participated in did a piece of music that was, in many ways, quite transcendent.

It comes to mind from time to time. Today, as I was thinking about the very idea of prayer and the many ways that the concept has evolved in my thinking over the years, it came up again.

Lord, make me an instrument of Thy peace
Where there is hatred, let me sow love
Where there is injury, pardon
Where there is doubt, faith
Wherever there's despair, hope
Where there's darkness, light
Where there is sadness, joy.

Lord, make me an instrument of Thy peace

Oh Lord, grant that I may not so much seek
To be consoled as to console
To be understood as to understand
To be loved as to love

Lord, make me an instrument of Thy peace

For it is in giving, that we receive
For it is by faith that we believe
For in forgiving we are forgiven
It is in dying that we are born to eternal life

Make me an instrument of Thy peace

The arrangement we sang was this one, set to music by Mary McDonald. She attributed the lyrics—as many do—to St. Francis of Assisi, but there appears to be little evidence that they are any older than the early 20th Century.

Regardless of their origins, these words are resonating with me today. I hope they provide something meaningful to you.

Zapier: Add a Timestamp to any Zap

So you’re building a Zap and you need to add a timestamp to an Action, but your Trigger doesn’t include a timestamp. You’re in luck! Zapier has a handy-dandy built-in method for doing this:

{{zap_meta_human_now}}

For example, I just connected up a Webhooks subscription using Zapier’s built-in “Catch Hooks” Trigger.

The REST API I connected to it supplies several data fields, but there’s no record of when the Webhook fires (there’s no date, time, or timestamp in the incoming webhook).

As I had already connected the Trigger to a Google Sheets action (“create a new row”), it occurred to me that there should be a way to just add a timestamp column in Google Sheets and then automate a timestamp when the new row gets created.

Enter Zapier’s fantastic documentation. I thought I might have to use one of Zapier’s nifty built-in “Filter” apps, but actually, all it required was copying and pasting the value above into the Zap Action’s field that correlates to the new column I added in Google Sheets!

Zapier documented this in the Advanced Tips –> Modifying Dates & Times section of their documentation.

As soon as you paste `{{zap_meta_human_now}}` into the field in your Zap, Zapier transforms it like this:

Zapier autogenerates a "Now" timestamp for you!

This automagically showed up in Google Sheets like this:

Google Sheets column with Zapier Timestamp in it
Zapier Timestamp Added to Google Sheets

Time Zone Considerations

One “gotcha” that didn’t occur to me until this Zap started running: I didn’t have a timezone preference set in my Zapier account, so it defaulted to UTC time. Thankfully, this is configurable!

Not only is the time zone in your Zapier account configurable, but you can use multiple time zones in the same account if you need to by using a more specific “timestamp” formula. For example:

{{zap_meta_edt_iso}}

This will spit out a “machine-readable” (ISO 8601) timestamp, including the offset for the timezone you choose. The example above uses edt for “Eastern Daylight Time.” This allows for a little more clarity so that you don’t have a situation where the app receiving your data also offsets for timezone, which would goof up the accuracy of your time (and date).
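
For reference, the pattern generalizes. These are variants I’ve seen in Zapier’s docs (treat the exact list as something to verify there):

{{zap_meta_human_now}}   → human-readable timestamp in your account’s time zone
{{zap_meta_utc_iso}}     → ISO 8601 timestamp in UTC
{{zap_meta_edt_iso}}     → ISO 8601 timestamp with the Eastern Daylight offset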

All the options are at the link referenced above.

Reason #537 to Love Zapier

This is a small thing, but it’s a big deal in a situation like the one I was in today. One more reason to love Zapier!

Lenovo USA Customer Service Saves the Day

Lenovo Customer Service Saves the Day

A few weeks ago, I did something I'd been planning to do for a long time: I ordered a Lenovo ThinkPad.

Having looked forward to this for years, I was refreshing the UPS tracking URL like a fiend, and I knew the moment it arrived. Imagine my shock when I opened the box and discovered a completely different machine than the one I ordered.

What followed can only be described as a sequence of customer service SNAFUs that I can only hope were accidents.

Regardless, I'm happy to report that as of today, Lenovo USA has gone above and beyond and made everything right.

Some Important Thanks

There were a few key players who got involved when this situation was rapidly devolving into a disaster:

  • Kayle, who answered one of my phone calls and then really took ownership of the issue until it was resolved
  • Erica, who was one of the helpful folks staffing the @lenovohelp Twitter account, and who pushed a case through the necessary escalation to get it to Tonya
  • Tonya, who called me back in response to the case that was created after my Twitter outreach, and who emailed me her direct contact info in case Kayle's efforts were unsuccessful.

Perhaps the biggest hero of this whole story is a guy from Pennsylvania named Cameron. When I reached out to him because I'd received the machine he ordered, he responded and agreed to ship me my machine (which thankfully he had received) while I shipped his to him.

A Dream Machine

My Lenovo ThinkPad T570 is truly a fantastic piece of hardware. It lives up to the longstanding reputation of the ThinkPad family of laptops going back to the IBM days.

To give you some context: I hate spending money on laptops. There are a number of reasons for this, but the most significant one is probably the sheer amount of abuse that gets dealt out to any laptops I own. With frequent travel for consulting and speaking, my devices get a lot of miles.

Consequently, years ago, I adopted a policy that I would buy the cheapest possible machines that I could find. This approach served me well. My last machine lasted me for over 6 years (a record by at least a factor of 2), and I originally paid less than $500 for it (including tax) at a local Best Buy.

Because I'm a geek and I like to tinker (another reason I don't enjoy laptops as much as the good old days of desktops that you could take apart and upgrade), I'd done a number of things to my last machine over the years to keep it going.

Like I said… I've been cheap where laptops were concerned.

But that last machine (a Gateway), was seriously on its last leg. And part of the reason I kept trying to stretch out its life was because I was avoiding Windows 10 at all cost, and was grateful that Windows 7 was still serving me reasonably well.

Since I'd proven I could make a cheap machine last so long, I decided to re-think my strategy a bit. What could I do with a high quality piece of hardware that was spec'd out with a super-fast SSD, a top-of-the-line processor, a decent GPU, and tons of RAM?

For high quality hardware in the laptop space, there's nothing better than Lenovo's ThinkPad line. And they tend to be built to be taken apart and upgraded, which adds an enormous benefit to me personally.

So I started watching the Lenovo Outlet (yes, even when I'm making a bigger investment, I can be a little cheap) for a ThinkPad with at least 32GB of RAM, an Intel i7 CPU, and a decent NVIDIA GPU. When I found this ThinkPad T570 that I'm writing this blog post on right now, I was elated. It was a great price, and it had everything I was looking for (except the SSD capacity, which I planned to fix by adding another SSD).

Perhaps now you can see why I was so utterly disappointed when I got a completely different machine that didn't have anywhere near the specs that I had paid for.

I immediately opened a chat session with a support person on the Lenovo Outlet website, got lots of assurances, but ultimately no help. After a few days of giving plenty of time for people to work and swapping emails, I took to Twitter:

They responded, but initially they just investigated my existing case and weren't able to improve the situation.

A few days later, I was tired of waiting, so I placed a phone call and expressed significant displeasure with the entire support experience. I kept poor Kayle on the line for far longer than she wanted to be, while I tried to urge her to do the right thing for me.

Not confident that that would turn out well, I took to Twitter again:

That resulted in a call from Tonya, who checked into what was being done for me, and who also invited me to contact her directly if things didn't turn out well.

Ultimately, Lenovo paid for the shipping cost I incurred when I sent the laptop I initially received to its rightful owner. I'm sure that had my initial contacts with their support people worked out, they would have arranged for that to occur, but they were simply not responding fast enough nor appropriately.

It was an unusual situation, to be sure. When I compared my order number with Pennsylvania Cameron's, they looked similar. If you were working in a Lenovo warehouse full of nearly identical boxes, it would be easy to stick the wrong shipping labels on two of them. Thankfully, the person who made this simple mistake merely swapped the two labels. It would have been a real disaster if more orders/shipments had been affected.

In any event, I couldn't imagine having Pennsylvania Cameron ship my long-awaited machine back to Lenovo's warehouse, letting them sort out what happened, and then almost certainly be unable to ship it to me for one reason or another. (In fact, the first thing the customer support person I initially contacted via chat wanted to do was cancel and refund my order. I didn't want my money back, dangit. I wanted the machine I had watched the Lenovo Outlet for!)

In any event, I've managed to write one of my longest blog posts in recent history about a customer service issue. I'm going to wrap it up now by saying this:

Even though I initially had a poor experience, Lenovo USA has truly won my trust and turned a negative situation into a perfectly acceptable one.

Thank you, Lenovo. I love my ThinkPad.

FCC Comments “Researcher” Abusing Email Addresses Required to Be Public Record

Did you submit this comment to the FCC?

Today, I received an email purporting to be from someone named Courtney Pruitt, whose email address was courtney@fcccommentsresearch.org, with the subject line:

Did you submit this comment to the FCC?

The email is the first that I’m aware of receiving in connection with my public comments to the FCC on the net neutrality issue.

My first thought was, “Hey! The FCC hired a research firm to verify which comments were from legitimate senders by emailing the commenters and asking them to verify their comment.”

On closer inspection, however, there is nothing official about this email at all. In fact, there are some suspicious details:

  • The domain name, fcccommentsresearch.org, was registered via Namecheap on August 9, 2017.
  • The domain name was registered with privacy turned on, which means that the public WHOIS shows proxy information rather than the details of the actual registrant.
  • The statement, “We are investigating comments submitted to the FCC website on a public filing about net neutrality.” could be true of the email’s sender. It doesn’t actually even imply that the FCC has anything to do with the research nor the sending of the email message I received.
  • This statement, “Responding to this email with help us verify real comments so that we can discover how the public truly feels about net neutrality.” is even more telling. Whoever “we” refers to, “they” want to know how the public truly feels.
  • The message was sent via Mandrill, a bulk email service provider owned by The Rocket Science Group, parent company of MailChimp.

The Mandrill platform allows for open tracking, and although I haven’t actually done a thorough analysis, I suspect this message contains a beacon for the purposes of open tracking. Thus, the sender knows that I’ve looked at the message.

I did not, however, click any links, nor did I take the action requested, which was to reply with a “yes” or “no” to the question of whether or not I actually submitted the comment that they quoted.

The message began:

Dear David Johnson,

According to the FCC website, you wrote a comment on 2017-07-14 10:37:46 about Net Neutrality.

Could you confirm that you submitted it by replying “Yes” to this email? If you did not submit this comment, please reply “No.”

The text of the message attributed to you is:

I’m redacting my comment here, although the email message I received did appear to have something I wrote (I didn’t actually check it thoroughly).

Then the email was signed as follows:

Thank you,

Courtney Pruitt
Data Analyst with Ragtag

Why you are receiving this email:

We are investigating comments submitted to the FCC website on a public filing about net neutrality.
Your name/email is attached to a duplicate comment, and we just want to make sure it was you who submitted it. You can see the comment on the FCC website here: https://www.fcc.gov/ecfs/filing/XXXXXXXXXXXXXXX [redacted]
Responding to this email with help us verify real comments so that we can discover how the public truly feels about net neutrality.

refid:XXXXXXXXXXX [redacted]

The refid at the bottom of the message is presumably a unique identifier which could be used by the receiving system to automate the analysis of replies.

I’ve always been mildly alarmed by the need to put my email address into information that will be displayed, unredacted, to the public. This makes my information susceptible to scraping. Whether or not this is actually a good idea is open to debate, as far as I’m concerned.

But since this is the first time that I’m actually becoming aware that my publicly-displayed email address has actually been used to contact me (and I’m only aware of this occurrence because it makes reference to my actual comment), this is my chance to voice my concern publicly.

Who is it that has scraped my comment and my email address? 

Why do they take significant steps to mask their identity?
No Google search for “Ragtag” with phrases including “data analyst,” “data analysis,” or even “Courtney Pruitt” turned up anything useful. And I refuse to click the link for the word “Ragtag.” Its destination URL is masked by the Mandrill bulk email process in order to allow for click tracking by the sender.

Who has the resources to pay for this kind of research?
My concern on the net neutrality issue is that the big ISPs like Comcast, Verizon, etc. are the ones who would conduct research like this so that they could manipulate the outcomes of the research. Am I jaded and cynical? Probably.

If you’re a legitimate researcher, why not be more open about this research?
Will the data be shared publicly? If so, by whom? When? Where?

I don’t normally take the time to write about phishing emails. Maybe this is legitimate research, and maybe it isn’t.

Whatever it is, it rubbed me the wrong way.

 

Why Even Conservatives Want to Save the Celery Fields

The past seven or eight months have been quite a learning experience for me.

We hadn’t made it far into 2017 when I first heard of the plan to construct what amounts to a dump at the Celery Fields in Sarasota.

Now I am not what you would call an “activist.” Prior to 2017, you could count on one hand the number of times you’d have spotted me holding a sign in a crowd. Some people enjoy it and would find a reason to do it every week if they could. I just don’t happen to be one of them.

Nor am I what you would call an “environmentalist.” Environmental policy has become one of a growing number of issues on which I consider both sides of the argument to be less than honest and certainly not aboveboard.

Nor am I someone who believes in giving government even more control over our private lives. I regard the increasing incursion of the State into our individual liberties as a dangerous menace—one that was foreseen by the Founders and Framers, whose counsel on the issue we ignore to our own peril.

In other words, I’m not what you would call a “liberal,” at least not according to any modern definition of the word.

And yet, in 2017, I’ve signed petitions, gone to rallies, attended public hearings, and spoken out in a number of ways to ask Sarasota County to deny petitions to build on Celery Fields lands.

Knowing this, many casual observers might assume that they could accurately guess my political leanings.

And yet, over the last number of months, I’ve worked alongside card-carrying members of both of the major political parties, and many others who are harder to classify, politically speaking. Lifelong Republicans, tree-hugging environmentalist Democrats, frustrated independents, and members of other parties have aligned themselves to oppose these projects, most notably the one proposed by local developer turned public official, James Gabbert.

And this is how I have learned so much this year. Never before have I seen so many people willing to lay aside their cherished ideologies and work together with people who, in other circumstances, they’d probably vehemently oppose.

There’s an old saying about nothing being stronger than the heart of a volunteer, and that is what I’ve observed—and been humbled by—throughout this process.

I’ve watched people give up hundreds of hours of their time, sacrifice business opportunities, and risk embarrassment (or worse) to protect the Celery Fields. Business owners, former journalists, retirees, birders and wildlife enthusiasts of all ages, and—yes—people whose interest in activism has risen to levels deserving of the word, “professional,” have linked up with one another in a shocking display of heart.

Why?

It can only be because they care.

They care about beauty. They care about nature. They care about what sort of society we are creating. They care about what we do with publicly-owned land. They care about our water supply and the health of estuaries and aquatic wildlife. They care about the birds and the beautiful habitat that’s been created over the last twenty-plus years, by accident or not. They care about preserving the peaceful serenity of the Palmer corridor, which only looks “industrial” on maps created during the Reagan Administration. They care about the two thousand homes—soon to exceed 2,600 with new developments going in—and the neighborhoods that have grown up around them. They care about Tatum Ridge Elementary School and the 700 students, not to mention the hardworking faculty and staff, just down the street.

And it’s been my honor to work alongside such caring people. Maybe in other circumstances and on other questions, the things we care about would find us disagreeing.

But after months of working alongside people of this quality, I must say I’m more inclined than ever to really listen and to try to understand where they’re coming from and to see if perhaps there aren’t better ways to solve our problems than the bitter public thrashings that seem to be the order of the day.

In other words, I care too—about the Celery Fields, sure—but more importantly, about the people I’ve come to know through this unique experience.

Tomorrow, we’ll hear from the County Commissioners. That means today we’re all busy making our final preparations. Regardless of the outcome, I want all of you to know how much I have grown to deeply respect and appreciate you. Thank you for the honor of working with you.

Congratulations to Jo Hagan, CPA

A huge congratulations to our longtime accountant, CPA, and all-around business adviser, Jo Hagan, on the rebranding of her accounting practice from Hagan CPA, Inc. to Barefoot Accounting, PA.

In the process of re-branding, Jo recently launched an all new Barefoot Accounting website.

Jo and I met at a networking event in 2001, not long after 9/11. I had just gone full-time in my business, and she had decided she wanted to sacrifice some of the security and stability of a corporate job in favor of working from home and being there for her two young daughters.

One of the benefits of working from home, of course, is that you can go barefoot if you’d like! Today, Jo’s daughters are off in college, and Jo is taking advantage of the opportunity to help others like her who might be Moms or would otherwise prefer to work from home.

Jo’s new brand represents a fantastic new direction as she helps other CPAs and bookkeepers develop a stable livelihood while simultaneously refocusing her own efforts on providing top-shelf business advisory services that go way beyond filing taxes and producing reports. We couldn’t be happier for Jo!