Weekend Project: Using a Nest Hub Max to Access GSuite + Control a Nest Home

One of my long-weekend projects was getting a Nest Hub Max to access my GSuite work calendar and reminders while also controlling my smart home via my personal Gmail and Nest accounts. Google has said for a while that this is supported, but it took quite a bit of trial and error to actually figure out how to make it work.

Note: For this to work, you must be on the same Wi-Fi network for all of the steps.

Step 1 – Get the Nest Hub Max Working on GSuite

Starting from an out-of-the-box (or factory-reset) state, the first step is to set up the Nest Hub Max with your GSuite work account. I opted in to the device preview program. I also configured Voice Match and Face Match and set personal results to always show proactively. This allowed my work calendar to pop up reminders and show upcoming meetings proactively, and let me join Google Meet calls by tapping the display.

Step 2 – Unlink But Do Not Reset

The next step is somewhat counterintuitive. Go into the Google Home application and select the Nest Hub Max. Go into Settings, then select Remove Device. This will remove the device from your GSuite account’s Google Home so that you can add it to your personal account’s Google Home, where your Nest devices are installed.

Step 3 – Adopt the Device Into Your Personal Google Home

The next step is to switch accounts in the Google Home application to your personal account. You should see the Nest Hub Max as a local device; select it, go into Settings, and add it to a room. This will add it to your household.

Step 4 – Configure the Assistant Settings

In your personal account’s Google Assistant settings, remove Voice Match and Face Match from any of the Nest Hub Maxes that you wish to use with your work account. This will ensure that there is no confusion.

Then, switch accounts to your GSuite account.

Go down to Voice Match, select Add a Device, and select the Nest Hub Max. This will get your GSuite account linked to the Nest Hub Max.

Then, while remaining signed into your GSuite account in Assistant Settings, go into Face Match. Hit the “+” icon to add a device, and then select the Nest Hub Max. Then you can configure Personal Results under Devices in Assistant Settings while still signed into your GSuite account to show proactively or show upon face match (which is what I have chosen).

Then, whenever you are in front of your Nest Hub Max, you have access to your work calendar AND can access all of your Nest Devices and other Google Home functionality under Home Control.

This is the best of both worlds, but it unfortunately took a lot of research, trial, and error to figure it out. Hope this helps!

ARM & Apple Silicon: The Future of Desktop Processors

I am in day two of living the Apple Silicon life. My 13″ MacBook Pro with 16GB and 2TB arrived yesterday. My initial impressions are that ARM and Apple Silicon (by proxy) are indeed the future of desktop processors. And once again we see a classic case of Microsoft having the right idea first, but Apple being better at execution.

Echoes of Windows 200x and the DEC Alpha

In 1998 I was working on what was then labeled Windows NT 5.0 Beta, simultaneously on Pentium II/III x86 systems and DEC Alpha systems. It was clear from the performance that 64-bit was the future; not being bound by 4GB of RAM, the DEC Alphas blew away every performance and stress test we could throw at them. Only one small problem – nothing ran on them. Nobody ever bothered porting anything, and the Alpha (along with the other 64-bit Windows ports) died a sad but quiet death. The lesson: compatibility matters.

Windows Server 2003 and Windows XP x64

Fast forward five years: I was a Group Program (Product) Manager on Commerce Server, seconded to the Windows Server System (later Common Engineering Criteria) project as a side hustle to be part of the cross-company working group standardizing server products at Microsoft. One of the programs I was driving was the adoption of x64 processor architectures and 64-bit Windows. I had an early engineering sample with an AMD x64 processor – AMD having scooped Intel for the first time in its history.

All the things I loved about the DEC Alpha were there – blazing fast performance and no memory bounds. But there was one big difference – everything just worked. Windows-on-Windows (WOW64) allowed for perfect compatibility with 32-bit Windows applications. Provided you had 64-bit device drivers, application compatibility was a non-issue.

This transition was entirely successful – Microsoft migrated server workloads in ~4 years. Desktops shortly followed suit. And nobody ever looked back.

Ill-fated Attempts at Tablets and Windows-on-ARM

Microsoft invented the Tablet PC form factor. The devices made for incredible demos but suffered from lots of practical form-factor gotchas and terrible battery life. Even the most die-hard dogfooders gave up and switched back to regular laptops within a few years, and the form factor died away.

Then, Apple invented the iPad. It worked with all of your favourite iPhone apps. It used the same developer ecosystem as the iPhone. It had an amazing browser. And it just worked and did all the things you expected of a tablet, including 3G connectivity on the go. I stood in line for hours to buy one and never looked back. It totally changed how I consumed content. And the tablet as a mass-market form factor was born. Lesson two: design and user experience matter.

With Windows 8, Microsoft tried the tablet form factor again with the debut of the Surface RT, the Surface 2, and the never-launched Surface Mini. And so debuted Windows RT, the first incarnation of Windows running on an ARM processor.

The iPad and Surface RT/2 proved a point – ARM processors were far better suited to the tablet form factor than x86/x64. They had the battery life. There was just one problem with Windows RT: like the past mistakes, nothing ran on it. And, like its predecessors, it died away in one of the biggest write-downs in Microsoft’s history.

The Surface 3 and subsequent devices got the user experience of Windows on a tablet right. But they were still tied to x86/x64 processors and suffered from horrible battery life.

As the years passed, Apple made the iPad more and more laptop-like. And Microsoft didn’t give up on Windows on ARM.

2019 saw the launch of the Surface Pro X (along with several peers) running Windows 10 on ARM processors. This time, Microsoft partially re-learned the lesson from x64 – 32-bit Windows applications run just fine (for the most part). So if you could live with native applications and 32-bit Windows applications, you would end up with the best Windows experience out there. (I can attest: I have a Surface Pro X, and it is by far my favourite on-the-go Windows device ever, with substantially better battery life than x86/x64.)

Just one small problem – lots of applications, like Adobe Creative Cloud, have moved on to x64, and these do not work (yet) on Windows on ARM. So, for a lot of people, it is simply not a viable option. Microsoft has committed to x64 emulation in an update that will ship in 2021, but it remains to be seen how well it will perform.

Lesson three: make the developer experience really seamless – Apple got this right on the iPad and Microsoft didn’t on Windows on ARM. This is why the iPad ended up with a vibrant set of native applications, and partially why there are still very few native Windows-on-ARM applications, including Office.

Enter Apple Silicon

At WWDC 2020, Apple finally unveiled the notion of Apple Silicon – macOS running on an ARM-derived processor based upon the A-series chips used in iPhones and iPads. Rosetta 2 debuted, building upon Apple’s experience migrating from Motorola 680x0 to PowerPC, and then subsequently from PowerPC to Intel x86.

The M1-based Macintoshes were announced and shipped in November 2020, and I’m sitting here writing this on the highest-specced MacBook Pro available. The verdict? I am *beyond* impressed, feeling the same sort of excitement as when I first used an AMD x64 engineering sample back in the day.

The developer experience is seamless – Office, Chrome, Fantastical, Twitter/Tweetdeck, and all of the Apple applications are there on day one. Photoshop is in Beta.

With Rosetta2, anything not requiring a kernel extension or direct hardware access (e.g. virtualization) just works. Perfectly.

The performance is incredibly impressive. Anecdotally, it is as responsive as a 16-core, 256GB RAM Mac Pro with an AMD Radeon Pro Vega II GPU. It is *MORE* responsive than a 16″ MacBook Pro with 8 cores, 64GB RAM, and an AMD 5500 GPU. It is cool and silent – unlike the 16″ MacBook Pro, which could get uncomfortably warm and constantly ran its fans. Amazing!

Battery life is out of this world – I have been using it with Chrome (and many tabs), Mail, Photos, Messages, Calendar, Contacts, News, Slack, Fantastical, Todoist, Trello, Jira, Miro, Messenger, WordPress, MindManager 13, Word, OneNote, Teams, Skype, Telegram, Twitter, Sublime, and WhatsApp Desktop running for 1.5 hours. The screen is at quite high brightness. Files are still synchronizing from the cloud via iCloud, OneDrive, and Dropbox. And my battery is at 95%. UNBELIEVABLE!

I can run a number of iPhone and iPad applications, which is quite novel. The experience is not great. But I expect that will change quickly.

My day two verdict – this is indeed the future.

Apple has pulled it off – again. And, once again, building upon an idea Microsoft kind of had first.

I hope Microsoft continues to invest in Windows 10 on ARM – and even considers partnering with Apple to offer a version of it on the Apple Silicon Macs (according to Craig Federighi, via Ars Technica, this is entirely possible). If they can eventually address the developer experience and compatibility, the amazing experience that is Apple Silicon could come to Windows (building on what is, in my opinion, already the best experience – if one can live with the restrictions).

Meanwhile, I’m more optimistic about desktop computing than I have been in a while…

Sitecore Commerce lives (in between many travels)

I realize I’ve been far quieter with respect to blogging than I had intended. As I get ready to head off on another trip (because 101.5K miles flown this year isn’t enough), my long layover at Chicago O’Hare gave me a chance to reflect on just how fast the last few months have gone since Sitecore acquired Commerce Server last year. 

Sitecore Commerce is a comprehensive product portfolio aimed at providing a commerce solution for all enterprise needs. It is much broader in charter and scope than Commerce Server ever was; what I had been working on constitutes just one component in the overall strategy. But the sheer amount of work to get integrated and to bring this to market has required a considerable amount of time, focus, and a cone of corporate silence.

It’s been a phenomenal journey. And with the product and strategy now in-market, the proverbial cone of silence can be lifted. So expect to see a bit more on here. 

And, for a recap: 

Sitecore® Introduces Industry-First Enterprise .NET Experience Commerce Platform (Press Release)

Sitecore announced an early access program for Sitecore Commerce, powered by Commerce Server, the first enterprise-grade .NET customer experience management solution with fully integrated commerce functionality. The new offering enables marketers and merchandisers to deliver personalized and relevant digital shopping experiences from initial customer acquisition through online transaction using a single, connected platform.

  • Sitecore Commerce Unveiled: The First Enterprise-Grade eCommerce Integrated .NET CXM (CMS Critic)
  • Sitecore Marries .NET Customer Experience, E-Commerce (CMSWire)
  • Sitecore Fully Integrates e-Commerce into Experience Platform (Digital Tech Diary)
  • Sitecore Adds Commerce Solution to Its Experience Platform (DM News)
  • Sitecore Adds Editing and Asset Management Capabilities for E-Commerce Sites (The Hub)
  • Sitecore Premieres Sitecore Commerce (KM World)

For some additional information, check out Mark Floisand’s and my blog posts:

And with that…I have a plane to catch – on to corporate headquarters in Copenhagen for the next two weeks.

For US Travelers, EMV Can’t Arrive Soon Enough

Most of the rest of the world has had EMV for years – I had it from the time I moved to Canada in winter 2008. Just what on earth is EMV? It is the chip-and-PIN support on your MasterCard and Visa credit cards that allows you to make purchases via PIN instead of swipe-and-signature.

Why is this important? Besides the obvious anti-fraud benefits, it is almost mandatory for making even basic purchases when traveling abroad. Consider my case. I’m working out of our Copenhagen office this week. I went to the local supermarket to buy things the TSA frowns upon, like razor blades and shaving cream – and Kleenex, which my hotel apparently doesn’t believe in.

And I found myself unable to make a purchase using my Visa or my MasterCard. My MasterCard, which supposedly has a PIN, didn’t work either. My debit MasterCard with a PIN? Another fail. A different store? Same result. My story had a happy ending. Others might not be so lucky.

I reset my MasterCard’s PIN over the phone (successfully) and went back – it still didn’t work. (And this was after two lengthy calls to my bank.) I called Visa – they were able to set up a PIN, but warned me that all transactions might be treated as cash advances. Lo and behold, I was finally able to make a purchase. Kudos to the United Club Visa card and the excellent customer service from Chase, who provides the card. They truly understand what world travel means.

But this required a great deal of patience, persistence, and resourcefulness that the average traveler might not possess. If you’re traveling abroad, beware and vet this in advance. Otherwise, when you get a chip-and-PIN card in the next year or so, rejoice.


Converting from Parallels to Hyper-V

You would think converting a virtual machine from one format to another would be simple, especially when there are multiple blogs out there on how to do it. But of course not – so it’s time for another misadventure-in-computing post. A colleague asked me for a copy of a demo virtual machine for some upcoming analyst activities. The problem: my colleague’s system is a Surface Pro 2 running Hyper-V on Windows 8.1, the source system was a personal Retina MacBook Pro running Parallels 9, and the VM is running Windows Server 2012 R2.

The core issue was that the source image in Parallels was a virtual UEFI system with a GPT partition table. The destination image was a Gen 1 Hyper-V VM with a virtual BIOS, which necessitates an MBR partition table. (I was unable to get the Gen 2 Hyper-V VM that supports UEFI to work given the differences in hardware drivers.)

The first problem is converting the disk file itself, as the formats are very different. The documented steps of using VMware and VirtualBox utilities to do the conversion did not work – I ended up with a corrupt virtual hard disk (probably because of the GPT issue).

So I found the awesome Disk2VHD utility from Sysinternals (http://technet.microsoft.com/en-us/sysinternals/ee656415.aspx). I ran it, chose to grab nothing except the C:\ partition, and voila – I had a new VHD that I was able to load on my Hyper-V server without corruption.
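Before fighting with boot records, it is worth confirming the converted file is structurally a VHD at all. Every VHD ends with a 512-byte footer whose first eight bytes are the ASCII cookie "conectix" (per Microsoft's published VHD format specification). A minimal sanity check along those lines – the function name and file path are my own inventions, not part of Disk2VHD:

```python
def looks_like_vhd(path):
    """Return True if the file ends with a plausible VHD footer."""
    with open(path, "rb") as f:
        f.seek(-512, 2)              # the footer is the last 512 bytes (2 = seek from end)
        footer = f.read(512)
    return footer[:8] == b"conectix"  # the VHD footer cookie
```

A converted image that fails this check almost certainly will not attach cleanly in Hyper-V, so it is a cheap first test before booting anything.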

Then, the fun of trying to get it to boot began. The normal way to fix a modern Windows system that will not boot is:

  • Boot from Windows installation image
  • Go to Command Prompt
  • Enter the following commands:
    • “bootrec /fixmbr”
    • “bootrec /fixboot”
    • “bootrec /scanos”
    • “bootrec /rebuildbcd” – except this critical last step fails because the partition table is not MBR.
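The reason that last step fails is worth spelling out: the disk still carries a GPT rather than a plain MBR. A GPT disk keeps a "protective MBR" in sector 0 whose single partition entry has type 0xEE, and its real header at LBA 1 begins with the signature "EFI PART". As a rough illustration, here is how you could classify a raw disk image (the sector size, function name, and return values are my own choices for this sketch):

```python
SECTOR = 512  # classic sector size; MBR and GPT header live in the first two sectors

def partition_scheme(image_path):
    """Classify a raw disk image as 'gpt', 'mbr', or 'unknown'."""
    with open(image_path, "rb") as f:
        lba0 = f.read(SECTOR)   # sector 0: MBR (or protective MBR)
        lba1 = f.read(SECTOR)   # sector 1: GPT header, if present
    if lba0[510:512] != b"\x55\xaa":
        return "unknown"        # no boot signature at all
    if lba1[:8] == b"EFI PART" or lba0[450] == 0xEE:
        return "gpt"            # GPT header, or protective-MBR entry of type 0xEE
    return "mbr"
```

bootrec works from the MBR view of the world, which is why converting the partition table is unavoidable here.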

So, you have to convert the partition table to MBR. And that’s where life gets interesting. After trial and error, I found the following to be the best way to do this without losing data:

  • Download the latest Live x86_x64 Fedora image from http://www.fedoraproject.org/. Yes, you really need a live Linux image because of a really handy utility called gdisk.
  • Boot the VM with the Fedora Live image and login to the desktop.
  • Run the Terminal utility.
  • Then, run the following commands:
    • “su -” – get into administrator mode
    • “yum -y install gdisk” – actually install the gdisk utility
    • “gdisk /dev/sda” – start running gdisk against the virtual hard drive
    • “r” – to enter gdisk recovery/transformation mode
    • “g” – to convert from GPT to MBR
    • “p” – to preview the converted MBR partition table
    • “w” – to write changes to disk.
  • At this point, the VM can be rebooted with Windows installation media to get to a Command Prompt to fix the rest of the problems. Once there:
    • “diskpart” – enters the Windows disk partition utility
    • “select disk 0” – selects the boot drive
    • “list partition” – should show the disk partitions present. The largest one is the actual one that you want to boot. It is probably going to be the third or fourth partition on the disk. In my case it was the fourth.
    • “select partition 4” – select the partition we want to boot.
    • “active” – to mark the partition as active in MBR.
    • “exit” – to exit diskpart
  • The next step is to find the Windows volume itself – it is probably going to be D:\ (at least it was on my system) because of the reserved GPT partitions that precede the actual usable partition. Once you determine the drive letter, you can proceed as follows:
    • “bootsect d: /nt52 /force /mbr” – makes D: bootable.
    • “bootrec /fixboot” – fixes core startup environment
    • “bootrec /fixmbr” – fixes core startup environment
    • “bootrec /scanos” – find the OS; note it will probably be D:\Windows in my example (and this is OK)
    • “bootrec /rebuildbcd” – update the BCD environment; note it will be D:\Windows in my example (and this is OK)
  • At this point the system can be rebooted without any installation media – and it should just boot up with everything in C:\Windows\ as it should be. Once booted:
    • Uninstall Parallels Tools
    • Install Hyper-V Integration Services (if required – not applicable on this system since Windows Server 2012 R2 has them built-in)
    • Reactivate Windows

And voila! I hope this helps.




My Take: Adobe Creative Cloud, Office 365, Software Subscriptions, and Old Photos

OK, so I am taking the plunge. Sometime before the July 31st cut-off for the CS6 migration promotion, I’ll be switching to Creative Cloud. Why? Feature-value.

I have been playing with an evaluation of Lightroom 5 (which coincidentally is the only product still available on a perpetual license). I’m sold. And now I want what’s coming in Photoshop CC. Why? I’ll let you be the judge.

Let’s start with a photo I took in 2006 when I was much more of a novice (though to be fair, I’m still a novice) in DSLR photography:


It’s a nice photo. But it could be oh so much sharper. Maybe something like this – tweaked with the new filters in Lightroom 5:


Now, I can’t wait to play with the even better features in Photoshop CC. I have a lot of good photos from my early days when I didn’t know what I was doing (and shooting only in JPEG with lower quality lenses and a much older body). The software has the potential to breathe new life into a lot of old “good” photos and make them “great” photos. The value is there. Sign me up. Especially while the CS6 promotion is in effect.

That said, the outcry over the subscription licensing is amazing. I have two perspectives on this. The first is from an industry direction. This is the wave of the future. Creative Cloud and Office 365 are the most mainstream examples. The trend started with lower end consumer services like ad-free versions of popular services as well as large enterprise Line-of-Business applications like Salesforce. Now “professional” and “mainstream” applications are filling in from the middle.

The pace of innovation in the technology industry no longer allows for multi-year ship cycles, but instead demands near real-time delivery. And the licensing models have to change as a consequence. Outside of the consumer mainstream, most organizations were paying for perpetual licenses and maintenance. The reality is that subscriptions are comparable – they just look different. And in some cases they are more cost effective. The biggest downside (and upside) is that you are turning a CapEx into an OpEx.

As a case in point, if you bought every Adobe Creative Suite upgrade plus every Lightroom upgrade the day it came out (i.e., you’re committed to innovation, like me), you are paying about the same per year as you would for Creative Cloud. You are just paying a big lump every year to year-and-a-half instead of every month. It feels different, but at the end of the day it’s about a wash.
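As a back-of-the-envelope check on that "about a wash" claim – the prices below are illustrative assumptions of mine, not Adobe's actual list prices:

```python
# Hypothetical numbers: a suite upgrade plus a Lightroom upgrade every
# release cycle, versus a monthly Creative Cloud subscription.
SUITE_UPGRADE = 749.0        # assumed suite upgrade price (USD)
LIGHTROOM_UPGRADE = 79.0     # assumed Lightroom upgrade price (USD)
UPGRADE_CYCLE_MONTHS = 18    # a new release roughly every 18 months
CC_MONTHLY = 49.99           # assumed subscription price (USD/month)

# Amortize the periodic lump over a year and compare to 12 monthly payments.
perpetual_per_year = (SUITE_UPGRADE + LIGHTROOM_UPGRADE) * 12 / UPGRADE_CYCLE_MONTHS
subscription_per_year = CC_MONTHLY * 12

print(f"Upgrades:     ${perpetual_per_year:,.2f}/year")    # $552.00/year
print(f"Subscription: ${subscription_per_year:,.2f}/year")  # $599.88/year
```

Under these assumed prices the two land within roughly ten percent of each other – the lump is simply amortized into monthly payments.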

That being said, whenever you change how something works, there is the process of change management. This is where Adobe made a very polarizing move by forcing people to switch – and in a very short time. So far the bet appears to be paying off based on their recent financial results. Time will tell.

It could have been done with a lot more grace, however. Microsoft, who seemingly can do no right in people’s eyes no matter what, has done a much better job with Office 365. They have perpetuated their old business models, so there is no culture shock. If you want to keep doing things the way you have always done them, that’s great. But they have introduced substantial incremental value with Office 365. To date, depending on the option, you get things like:

  • Office for iOS
  • Varying levels of Web/cloud features (e.g. – SkyDrive storage, Web applications)
  • Cost/benefit of cloud infrastructure vs. running on-premise (the real benefit varies by organization/user)
  • Multiple desktop OS support (e.g. – you can be Web, Windows, or Mac or some combo)
  • Multiple device support (which is a huge first for Microsoft)
  • And more coming seemingly all the time…

This was a much better way to manage the change – as adoption is taking off and nobody is screaming much (yet).

In closing? I’m excited about breathing new life into a bunch of old photos, absolutely resolute that subscription models are the wave of the future, and convinced that the key to doing it right is having an elegant transition like Microsoft versus an electroshock like Adobe.

PS – In-Flight WiFi really rocks. Written over Montana at 34,000 feet on United 675.

My Review of Windows Phone 8

My first smartphone was a Windows Mobile device – a Motorola MPX200 to be precise. And that was all I used for the next few years culminating with the Treo 750W on Windows Mobile 6.5.

Then, the iPhone 3G came around. Everything about my iPod and phone was combined into one. And everything Windows Mobile 6.5 did, the iPhone did better – or had an application for that. I have been a steady user of the iPhone, having used the 3G, 3GS, 4, and 4S and frankly never looked back.

With the debut of the Nokia Lumia 920 and Windows Phone 8, it seemed to be time to give Microsoft another shot. So while in Canada I picked up an unlocked penta-band device and used it on both Bell Mobility in Canada and AT&T in the United States. And this is my take after some hardcore daily usage using the “Portico” update.

The Good:

  • Industrial design and build quality is fantastic
  • Voice quality is as good or better than any cell phone I have had since switching to a GSM phone in the United States
  • E-mail, contacts, calendar, and tasks are best-in-class; I live by Outlook and this is by far the best experience I have used on a mobile device
  • Integrated Office support is also best-in-class; zero issues opening and working with documents
  • Bluetooth contact pairing and caller ID support is better than any device I have used
  • Battery life has been outstanding
  • Contact filtering is truly useful – especially once you merge in all of your social networks. The search to then get back to all of your contacts is extremely well-done.
  • The Nokia application collection makes the phone truly stand-out (at least amongst its peers) and helps close the “app-gap”
  • Camera quality is absolutely fantastic (daylight focus issues fixed after the “Portico” update)
  • You can use the device while wearing gloves – very handy when working in and traveling to cold climates like Ottawa
  • Native QR codes support in the Bing application is seriously cool

The Bad:

  • Dial functions over Bluetooth break routinely in the car; after placing a call the phone becomes unpaired from my vehicle and I can never get it to re-pair without rebooting the vehicle
  • There is a serious lack of applications – including some promised ones. Notable misses from things I have become used to or depend upon:
    • Instagram (despite being promised)
    • United Airlines
    • Air Canada
    • Comcast’s series of applications
    • Tivo
    • Google+
    • Urbanspoon
    • Opentable
    • Yelp
    • Egencia
    • eBay Motors
    • Uber and Taximagic
  • Synchronizing the device with my iTunes Library was a disaster, despite this being a “feature”:
    • It took over 2 days to synchronize my photographs (8,000) – when continuously connected and synchronizing
    • Seemingly less than 15% of the music tracks I selected to sync actually copied – despite them all being either ripped from CD, pure MP3, or DRM-free music
    • I was never able to send an audio file to use as a ringtone successfully
  • IE10 is just not as smooth nor as well-performing as Safari on a handheld device
  • Some applications are substitutes and not as good as their original versions:
    • BoxFiles (in lieu of Dropbox)
    • MetroTalk (in lieu of Google Voice)
  • Other applications I use such as Evernote or Twitter or Facebook are lacking features from their iOS brethren
  • The Lumia 920 is not preconfigured for other carriers the way an unlocked iPhone is; instead you have to know to download the Nokia Access Point application, find it (it lives in Settings and isn’t a regular application), and then occasionally tweak it further when a setting changes – it is not a world-ready plug-and-play solution

The Verdict:

I wanted to make this my primary phone. I really, truly did. But it was actually the basics that killed it for me. I use Bluetooth extensively; having it not work reliably is a non-starter. Likewise for sync – if I have to dredge up an iPod and start carrying yet another device for a primary use case, it kind of defeats the purpose of having a multifunctional device.

The lack of applications is definitely annoying, but it is not the end of the world. Though I imagine if I had used it for more than a few weeks, it would have gotten on my nerves.

That said, I will definitely miss the superior e-mail, contacts, calendar, and tasks interface – coupled with native Office support.

So, for now, Windows Phone 8 and the Lumia 920 have earned their keep as my international phone for use while traveling on other carriers. I only hope that with further evolution some of the gaps can be closed and it can become my primary phone.

It’s frustratingly close…

Out of Pseudo-Stealth Mode

To summarize the last twelve months: wow!

Since the acquisition of Cactus Commerce by Ascentium, a big part of my day job was working on integrating the two businesses into a new brand and value proposition. This finally came to fruition with the launch of the SMITH brand a few weeks back, along with the retirement of Cactus Commerce and Ascentium brands. Check out www.smith.co to see the results!

But, the real focus has been capitalizing on the opportunity with the transition of the Commerce Server business. The last year has seen us extracting the product from Redmond; re-branding it; doing lots of 1×1 engagement with customers, partners, & analysts; and now debuting what’s next…

Last Friday brought the Release Preview of Commerce Server 10, which addresses being CMS-agnostic and the challenge of properly separating business and presentation logic so that HTML/CSS/JavaScript designers can make look-and-feel changes. Likewise, our new brand – commerceserver.net – debuted, bringing the right management structure and branding to effectively curate the global partner ecosystem for the product.

The initial reactions have been very positive – nothing like Peter Sheldon’s excellent post from Forrester Research to sum it all up: http://blogs.forrester.com/peter_sheldon/12-11-30-commerce_server_cactus_commerce_ascentium_the_path_forward_0

These changes have been a long time coming. We had to get it right. And with them finally seeing the light of day, I can finally come out of pseudo-stealth mode. 🙂

And to add ANOTHER adventure…

As if being acquired, a new job, and an international move were not enough, let’s just add getting the helm of Commerce Server back. No, that’s not a misprint, as captured by:

This certainly was almost as much of a surprise to us (we have known for only a short time) as I am sure it is to customers. However, this is one challenge I relish – the opportunity is great to truly take the product, evolve it into what customers want and need, and do so free of the many constraints under which it has been placed in the past…

The Next Adventure Begins

On September 6, Cactus Commerce, the company for which I have worked for the last four years, was acquired by Ascentium Corporation of Bellevue, WA. It’s truly ironic that after moving to Ottawa from Seattle, we’ve been bought by a Seattle-based company – and now I’m being relocated back to Seattle to take on a new executive role within the combined organization. With this, I expect things are going to be quite busy, and not as many posts for a while.

But in the meantime, you can read all of the details here:

Stay tuned…