My Sojourn with Android

I am a global traveler, spending a significant amount of my time in our Ottawa, ON; Copenhagen, DK; and Brussels, BE offices, plus a fair bit of time in the United Kingdom. Being connected is part of my job, which means having a local mobile plan is a must. I have three SIM cards (US, Canada, and Denmark with EU-wide coverage), but three phones are at least one too many to carry.

I broke down and bought an unlocked dual-SIM Samsung Galaxy S7 Edge from Amazon. It is the closest Android equivalent to my workhorse iPhone 6S Plus (setting aside the single-SIM Nexus 6P, which is what I really would have liked had a dual-SIM variant existed).

And, after two weeks, I thought it’d be interesting to journal my impressions of Android. On the hardware front, I’d give the Samsung the edge over Apple in a few areas:

  • Curved display
  • Waterproofing
  • OLED screen

But it is a step backward in terms of camera and storage (as you lose MicroSD support when using dual SIMs, and I’m not brave enough to try the hack). Battery life is about the same.

On the software side, the two couldn’t be more opposite:

  • Android has a setting for everything; figuring out what is what can be quite confusing and Search is your best friend. But, you can personalize the device until you’d hardly recognize it from the default.
  • All of the app-store applications are there – but some are clearly not as polished or stable as their iOS brethren.
  • Android from Samsung ships with both the Google apps and the Samsung apps, plus a Samsung-based lock screen and launcher. In theory, you can replace all of these with apps of your choice…
    • In practice, I have not been able to replace the Lock Screen without running into challenges.
    • I have replaced the Launcher and all default applications with Arrow and the applications from Microsoft.
  • Chrome vs. Safari is a matter of personal preference; I prefer Chrome but still end up using Safari given Apple’s reluctance to truly let the browser get replaced on iOS.
  • iTunes is definitely a missing link (though with limited storage I could not sync that much anyway).
  • Dual SIM support is pretty elegant, but only one SIM can work at LTE speeds at a time, and you have to manually set which one is the primary (which was slightly confusing to figure out).
    • The inability to have separate data-roaming settings for each SIM is a definite gap I hope gets fixed in Android Nougat.
  • Software updates to Android seem more frequent than on iOS; however, OS upgradability is a huge problem. Who knows when this S7 Edge will see Android Nougat?

My overall verdict is that there is plenty of room for both Android and iOS, but that other operating systems are probably a lost cause at this point. How to size them up:

  • If price matters, Android is the way to go, as you can get a very affordable device with a complete app store repertoire.
  • If you want to customize everything, Android is the way to go. You just can’t customize to this degree with iOS.
  • If you want an absolutely seamless, intuitive experience where everything “just works” – then the choice is simple – get an iPhone.
  • For those heavily invested in the Apple ecosystem, the choice is also very simple: iPhone.

This post wouldn’t be complete without mention of Windows 10 Mobile. The Lumia 950 XL was the phone I really wanted to buy: Microsoft does dual SIM better than anyone, and it has the same polished end-to-end experience as iOS. However, the uninspired industrial design of the 950 XL (which is outshone by the lesser 650), coupled with longstanding reports of bugs, carrier feature incompatibility (no WiFi calling), and an absolute lack of applications, made it a no-go.

Would Android make me switch from iOS? No… but it is interesting and fun to have, and I will definitely enjoy it while maintaining connectivity in Canada and the European Union (and, I guess, now Great Britain).

This is also a perfect product management case study of customer segmentation and the room-for-both phenomenon (and conversely the lack of opportunities for Windows Mobile, BlackBerry, or others).

Windows 10 and the Surface: Winning Me Back

To celebrate the 30th birthday of Windows, I figured it was time to share my own thoughts on the operating system.

With the Threshold 2 build of Windows 10 (build 10586) and the third generation of Surface devices (Surface Pro 3, Surface 3 LTE), Microsoft is winning me back. For those who know me well, that may be a surprising statement. But it’s true, and I have to give credit where it’s due.

By mid-2006, I had switched all of my personal computing to Intel-based Macs instead of upgrading to Windows Vista. From 2008 onward, I used a Mac almost exclusively as my daily machine, running Windows in a VM or via Boot Camp only where necessary. Although Windows 7 was good, the hardware available simply wasn’t that great. Around the same time, I traded my trusty Treo running the original Windows Mobile for an iPhone 3G and never looked back.

Windows 8 and its attempt at redemption – 8.1 – were a disaster. Windows RT was a disaster. The first generation of Surfaces was great to look at and touch – but not that great to use in real life.

Then, Microsoft woke up and refocused itself on its customers. Relentlessly. And it’s working.

The Surface Mini was killed; given Windows RT, it rightfully never saw the light of day. Windows RT itself was rightfully killed too – though it represents an interesting missed opportunity I may blog about separately.

But the Surface Pro 3 emerged, and it is a genuinely great device. All the rough edges of the first- and second-generation devices were filed off. Now, finally, there was a decent Windows laptop (some software/driver quirks notwithstanding). The Surface 3 LTE gave hardcore mobile users something very decent (also with some software/driver quirks). The Surface Pro 4 is nicely providing refinements where needed – and I can’t wait to get one when the model I want is available. And the Surface Book – although controversial – is a very interesting contender against the MacBooks.

Windows 10 is another story. It is actually very pleasant to use. All of the goodness of Windows XP and 7 is brought forward, the good ideas of Windows 8 and 8.1 remain, and it’s all nicely wrapped into a single package. It delivers an experience and applications that hold up well against stiff competition from Apple. And, with Threshold 2, it mostly works as expected from a stability perspective.

How do I know it’s working? I had several devices I could pull out on this long Washington, DC -> Seattle flight tonight; the Surface Pro 3 with Windows 10 Threshold 2 won – just as the Surface 3 or Pro 3 did in many meetings while traveling this week.

And it’s not just me. I have traveled extensively for the last four years, and airports and airliners are a great global litmus test of public opinion. Apple was clearly winning. But now I’m seeing Surface Pro 3s and Surface 3s pop up more and more, and most of the ones I see lately are running Windows 10. This is a much-needed regaining of ground with highly mobile professionals – ground that has been absent for Microsoft for many years. And in an even bigger testament, the Surfaces have captured the interest of my mother, who has used nothing but Apple since 2006.

The recipe for this success is very simple: they are focusing relentlessly on their customers and on delivering a quality user experience. There’s no rocket science to this; it’s just very hard work.

On the hardware front, I applaud Microsoft for the Surface. It has brought much-needed cachet back to Windows machines – something the OEM partners were definitely not doing. Although it’s taken far too long and several iterations, credit where it’s due for finally building a great desktop operating system and hardware combination.

(This post was written over Idaho on my Surface Pro 3 running Windows 10 Build 10586.)

Misadventures in Home Networking (or creating house-wide 2.4 GHz and 5 GHz coverage)

One of the perils of a large-square-footage house built in the 1980s is that it is impossible to get WiFi coverage from a single access point – and there is no Ethernet wiring in the walls. This is the tale of how I finally got the problem reliably resolved, with decent WiFi coverage throughout the house and on the two outdoor patios.

Resolving the Cable Backbone (or lack thereof)

Short of pulling Cat 6 through the walls, there are two technologies for creating a virtual cable backbone: Powerline Ethernet (which, as its name implies, uses the electrical wiring) and MoCA (which uses the coaxial TV cabling). Both offer several hundred megabits of capacity.

MoCA seemed the ideal solution – except that I was unable to get it to work due to IP address conflicts with the neighbors (seriously!). It turns out (per a very helpful upper-tier tech at Comcast) that I probably need a MoCA filter where the cable enters the house – and these did not exist in the mid-1980s.

Instead, I used PowerLine networking with hubs/access points in strategic locations, all leading back to my office, where everything is patched into a Gigabit Ethernet switch, which in turn is connected to my router. I plan to revisit MoCA once I can source and try a filter.

WiFi Hardware

For the routers themselves, I went with the Asus RT-AC87 and am running the open-source Asuswrt-Merlin firmware. The main one is configured as a router, the second and third as access points. IP addresses are managed by redundant DHCP servers running on Windows Server 2012 R2. DNS and WINS are both local, with DNS queries forwarded to Comcast’s DNS servers. I went with this approach because, beyond giving me greater control, DNS is automatically updated, making it easy for devices to find each other on the network. The other custom setting I enabled, besides WPA2-AES encryption, was dropping clients whose signal falls below -70 dBm, to force devices to switch access points while traversing the house.
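
For the curious, the Windows side of that setup boils down to a couple of commands. A minimal sketch – the server names, scope, and load split are hypothetical placeholders, and 75.75.75.75/75.75.76.76 are Comcast’s public resolvers:

    # forward unresolved DNS queries to Comcast (run against each local DNS server;
    # "dns1" is a hypothetical server name)
    dnscmd dns1 /ResetForwarders 75.75.75.75 75.75.76.76

    # pair the two Windows Server 2012 R2 DHCP servers for redundancy
    # (names, scope, and the 50/50 split are hypothetical)
    Add-DhcpServerv4Failover -ComputerName dhcp1 -PartnerServer dhcp2 -Name "Home-LAN" -ScopeId 192.168.1.0 -LoadBalancePercent 50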

The WiFi Channel Config

Getting the three access points to play nicely with each other is no mean feat. If they are all configured identically, their signals will overlap, so the 2.4 GHz and 5 GHz radios each need to be configured so they do not broadcast over one another.

The 2.4 GHz Channel Configuration

2.4 GHz radios can be configured to use one of a handful of channels and a 20 MHz, 20/40 MHz, or 40 MHz channel width. A 20 MHz width should be selected unless you have only a single access point, in which case 20/40 MHz makes sense. A 40 MHz width likely won’t be compatible with many devices, and if any 40 MHz channel width is selected, effectively the entire 2.4 GHz band is consumed. So, with three access points, the only channels that don’t overlap are:

  • Channel 11
  • Channel 6
  • Channel 1

– making the choice quite simple, with a hard cap of three non-overlapping access points.
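
If you want to convince yourself of the math, the center frequency of 2.4 GHz channel N is 2407 + 5*N MHz, and a 20 MHz-wide channel spans 10 MHz on either side of center. A quick sketch (bash, runnable anywhere):

    # 2.4 GHz: center of channel N = 2407 + 5*N MHz; a 20 MHz channel
    # occupies center-10 .. center+10
    for ch in 1 6 11; do
      center=$((2407 + 5 * ch))
      echo "channel $ch: $((center - 10))-$((center + 10)) MHz"
    done
    # -> 2402-2422, 2427-2447, 2452-2472 MHz: disjoint, with 5 MHz to spare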

The 5 GHz Channel Configuration (and where it really gets complicated)

5 GHz channels are a bit more complex, as you can configure 20 MHz, 40 MHz, or 80 MHz channel widths along with the respective fallbacks. Using 20 MHz channels makes little sense, as you’d be unable to leverage the higher bandwidth of the newer WiFi standards, and the latest 802.11ac standard needs 80 MHz channels to reach its top speeds.

Using a 40 MHz channel width, you can get Channels 36, 44, 149, and 157 on most access points – none of which overlap. This means you are limited to four access points. If you want to use 80 MHz channels, then you are limited to two access points – on Channels 36 and 149 only.

The other wrinkle is that the 1xx channels are not visible to devices more than a few years old. But since everything here is new enough (2012 or later, or can fall back to 2.4 GHz), I went with a 40 MHz channel width on Channels 149, 157, and 44. At some point, I may explore enabling an 80 MHz channel on 149 for the main-floor (and most used) access point and moving the other two to 36 and 44.
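
The same back-of-the-envelope math works at 5 GHz, where the center of channel N is 5000 + 5*N MHz and a 40 MHz channel bonds the primary with the 20 MHz channel above it (true for 36, 44, 149, and 157). A sketch:

    # 5 GHz: a 40 MHz channel spans center-10 .. center+30 when the
    # extension channel sits above the primary
    for ch in 36 44 149 157; do
      center=$((5000 + 5 * ch))
      echo "primary $ch: $((center - 10))-$((center + 30)) MHz"
    done
    # -> 5170-5210, 5210-5250, 5735-5775, 5775-5815 MHz: edges touch, nothing overlaps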

Parting Thoughts

This was far more complicated to get right than I ever envisaged. At this point, if I need more coverage, I’ll have to get repeaters given the channel saturation. And there is not a lot written on how to set this up. Commercial-grade WiFi solutions – such as a Cisco Meraki system – make this easier, but they cost far more than anyone would reasonably invest in home WiFi. Eero looks very promising – but is not yet available. So hopefully this helps if you find yourself in a similar situation.

Home WiFi Misadventures

The point of this post is to hopefully save any reader the aggravation I just went through over the last several hours – and the years of missed performance that preceded it.

The Topology:

The main access point is an 802.11ac Apple AirPort Extreme connected to a Comcast cable modem over Gigabit Ethernet. The access point feeds a 600 Mb/s (nominal) PowerLine network via its Gigabit ports, plus a Gigabit switch for local hardwired clients. Two 802.11n Apple AirPort Express access points hang off the PowerLine network, covering the upper and lower floors of the three-floor home, respectively. Everything was auto-configured using Apple defaults.

The Symptom:

I just purchased a Surface Pro 3 and discovered that its WiFi throughput was 3 Mb/s. After resolving several Microsoft-related issues, I boosted performance to a whopping 15-20 Mb/s. Not settling for this, I spent several hours researching…

The Problems:

I discovered several issues upon troubleshooting everything:

  • The out-of-box defaults use a single SSID for both the 2.4 GHz and 5 GHz bands. It turns out that a number of Windows systems do not like this configuration and will end up on the 2.4 GHz band; based on anecdotal testing, I may have had some Apple devices in this category as well.
  • The out-of-box defaults also left no control over channels. Clearly I had some channel overlap between the access points, which explained why performance degraded when devices were active across more than one access point.

The Solution:

  • Configure separate SSIDs for the 2.4 GHz and 5 GHz bands. Point any 5 GHz-capable device at the 5 GHz SSID.
  • Manually configure the channels for the 2.4 GHz band
    • Channels 1, 6, and 11 were used, as they do not overlap. Note: you can really only run three 2.4 GHz access points in a given location before channel overlap starts, since each channel’s spectrum overlaps its neighbors’ (e.g., Channel 1 overlaps with Channels 2, 3, 4, and 5).
  • Manually configure the channels for the 5 GHz band
    • Channels 36, 40, and 44 were used; there are varying reports that these slightly overlap, but no issues were seen in routine testing.

Now, 5 GHz devices routinely get 100 Mb/s+ from the Internet when on the main access point and 50-60 Mb/s when on a remote access point going through the PowerLine network.
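
If you want to see where any remaining bottleneck sits (the WiFi hop versus the PowerLine hop), measuring LAN throughput directly is more telling than an Internet speed test. One way, using the free iperf3 tool (the address is a placeholder):

    # on a wired machine behind the Gigabit switch:
    iperf3 -s
    # on the WiFi client, once near the main AP and once on a remote AP
    # (replace 192.168.1.10 with the wired machine's address):
    iperf3 -c 192.168.1.10 -t 10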

Hope this helps!

Lots of Good Stuff from Build and Beyond

So far, Microsoft has done a very good job of impressing me over the last week. In fact, I have seen some of the best decision-making out of the company in years. Although a lot of this may be due to One Microsoft and other forces already in play, some credit also certainly goes to Satya Nadella in his new role as CEO.

In short, what I saw that I liked:

  • Office for iPad – The overnight success of this shows that Microsoft has been leaving money on the table, even with a 30% cut going to Apple. Given it’s a v1.0 and is this polished, I look forward to what the updates will bring. Hopefully they will update it frequently in the style of a true iOS application and not on a Microsoft ship cadence. It has earned a spot on my daily-use list overnight and is hopefully a harbinger of many more great experiences delivered on other platforms.
  • Scott Guthrie – I worked for Scott directly for two years, and I would count those as among the absolute top highlights of my career. He is a rare mix of technical excellence and – more importantly – customer focus, including a willingness to look beyond the boundaries of the Redmond campus. He is absolutely the right choice to deliver technically innovative products that do what people want. Congrats to him and kudos to Satya.
  • Universal Applications – This has long been one of Microsoft’s biggest opportunities, and they finally delivered – with some decent migration strategies to boot. Microsoft built much of its success on developers, but that focus has gone by the wayside for a while; this felt like the start of them getting their groove back. The Xbox One support is a fantastic touch and a key differentiator – and may end up seeing me buy one, even though I have been a staunch anti-gamer since the discontinuation of Flight Simulator…
  • Cortana – This one exceeded expectations. Microsoft played to its strengths and brought in context from all of the other things it can touch, such as your inbox and calendar. And it gave you the extensibility to teach it (also playing to one of Microsoft’s historical strengths not seen lately). They acted uncharacteristically cool by keeping the codename from the Halo franchise, using the same voice actor as the Halo games, and hiring screenwriters to make the dialogue entertaining. Now, can we get a desktop and tablet version please? Maybe, unlike Google Now, it won’t think I work at the local QFC (another story)…
  • Mea Culpa on Windows 8 Desktop – The renewed focus on the desktop was great to see. It’s what people fundamentally want. Tim Cook was right on this one: you have to have a device-appropriate experience, which Windows 8 failed to deliver. Coupled with the lack of applications on the tablet side, it turned into a disaster. Giving people the experience they want, plus a true universal platform, gives it hope… we shall see.
  • Free Windows on < 9″ Devices – This was the unexpected one (along with offering O365 via the Apple App Store). Microsoft has long held onto its licensing policies with religious fervor. Seeing them be pragmatic about where they are in the market was a refreshing change, and one that would never have happened under the previous regime.

What was unimpressive was the rest of Windows Phone 8.1 – Cortana and the very elegant dual-SIM support aside (the latter being very key for international travelers like me, or in emerging markets where multiple carriers are a reality). It was a me-too play that should have been there all along.

Overall, my expectations have been exceeded. Kudos to everyone in Redmond – and especially all of my friends and former colleagues. Congratulations on delivering some of your best work in a very long time and finding some of the right grooves again.

Converting from Parallels to Hyper-V

You would think converting a virtual machine from one format to another would be simple – especially when there are multiple blogs out there on how to do it. But of course not, so it’s time for another misadventures-in-computing post. I was asked by a colleague to get a copy of a demo virtual machine for some upcoming analyst activities. The problem: my colleague’s system is a Surface Pro 2 running Hyper-V on Windows 8.1, the source system is a personal Retina MacBook Pro running Parallels 9, and the VM runs Windows Server 2012 R2.

The core issue was that the source image in Parallels was a virtual UEFI system with a GPT partition table, while the destination was a Gen 1 Hyper-V VM with a virtual BIOS, which necessitates an MBR partition table. (I was unable to get a Gen 2 Hyper-V VM, which supports UEFI, to work given the differences in hardware drivers.)

The first problem is converting the disk file itself, as the two are in very different formats. The widely blogged steps of using the VMware and VirtualBox utilities to do the conversion did not work – I ended up with a corrupt virtual hard disk (probably because of the GPT issue).

So to do this, I found the awesome Disk2vhd utility from Sysinternals (http://technet.microsoft.com/en-us/sysinternals/ee656415.aspx). I ran it inside the source VM, chose to grab nothing except the C:\ partition, and voila – I had a new VHD that I could load on my Hyper-V server without corruption.
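
For reference, Disk2vhd can also be driven from a command prompt instead of the GUI – a sketch, assuming the drive-list syntax from the Sysinternals page and a hypothetical output path:

    :: inside the source VM: capture only the C: partition into a new VHD
    disk2vhd.exe c: e:\export\demo-vm.vhd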

Then, the fun of trying to get it to boot began. The normal way to fix a modern Windows system that will not boot is:

  • Boot from Windows installation image
  • Go to Command Prompt
  • Enter the following commands:
    • “bootrec /fixmbr” – writes a fresh master boot record
    • “bootrec /fixboot” – writes a new boot sector to the system partition
    • “bootrec /scanos” – scans for installed copies of Windows
    • “bootrec /rebuildbcd” – except this critical last step fails because the partition table is not MBR.

So, you have to convert the partition table to MBR. And that’s where life gets interesting. After trial and error, I found the following to be the best way to do this without losing data:

  • Download the latest Live x86_64 Fedora image from http://www.fedoraproject.org/. Yes, you really need a live Linux image because of a really handy utility called gdisk.
  • Boot the VM with the Fedora Live image and login to the desktop.
  • Run the Terminal utility.
  • Then, run the following commands:
    • “su -” – get into administrator mode
    • “yum -y install gdisk” – actually install the gdisk utility
    • “gdisk /dev/sda” – start running gdisk against the virtual hard drive
    • “r” – to enter gdisk recovery/transformation mode
    • “g” – to convert from GPT to MBR
    • “p” – to preview the converted MBR partition table
    • “w” – to write changes to disk.
  • At this point, the VM can be rebooted with Windows installation media to get to a Command Prompt to fix the rest of the problems. Once there:
    • “diskpart” – enters the Windows disk partition utility
    • “select disk 0” – selects the boot drive
    • “list partition” – should show the disk partitions present. The largest one is the one you actually want to boot; it will probably be the third or fourth partition on the disk. In my case it was the fourth.
    • “select partition 4” – select the partition we want to boot.
    • “active” – to mark the partition as active in MBR.
    • “exit” – to exit diskpart
  • The next step is to find the actual Windows volume – it is probably going to be D:\ (at least it was on my system) because the reserved GPT partitions come before the usable partition. Once you determine the drive letter, you can proceed as follows:
    • “bootsect d: /nt52 /force /mbr” – makes D: bootable.
    • “bootrec /fixboot” – fixes core startup environment
    • “bootrec /fixmbr” – fixes core startup environment
    • “bootrec /scanos” – find the OS; note it will probably be D:\Windows in my example (and this is OK)
    • “bootrec /rebuildbcd” – update the BCD environment; note it will be D:\Windows in my example (and this is OK)
  • At this point the system can be rebooted without any installation media – and it should just boot up with everything in C:\Windows\ as it should be. Once booted:
    • Uninstall Parallels Tools
    • Install Hyper-V Integration Services (if required – not applicable on this system since Windows Server 2012 R2 has them built-in)
    • Reactivate Windows
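
For that last step, activation can usually be re-triggered from an elevated command prompt – a quick sketch using the built-in slmgr script:

    :: check the current license state
    slmgr /dlv
    :: attempt online activation
    slmgr /ato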

And voila! I hope this helps.


Misadventures with Windows 8.1 and BootCamp on a MacBook Pro

This weekend, I put Windows 8.1 Enterprise Edition 64-bit on a 15″ mid-2011 MacBook Pro. It turned out to be much more of a (mis)adventure than I anticipated.

What I Tried (and How It Failed)

My goal was simple: create a USB installer stick with Windows 8.1 on it, add the Boot Camp 5 software, and install Windows. That’s where it all went wrong:

  • First, I used the Microsoft Store USB utility to burn the Windows 8.1 Enterprise x64 ISO to the USB stick – big mistake. It formats the stick with an MBR partition table, and a Mac cannot boot from a USB stick unless it has a GPT partition table. To fix this, I ended up re-creating the stick on my 15″ Retina MacBook Pro, since OS X will only create a Windows installer stick on a Mac without an optical drive (see the diskpart sketch after this list for a Windows-side way to rebuild a GPT stick).
  • Second, it turns out Boot Camp Assistant won’t actually partition the drive unless there is a Windows installer disc in the optical drive. To fix this, I had to burn a copy of the installation media and put it in the optical drive.
  • Third, it turns out there is a bug (either in Apple’s firmware or in Windows 8/8.1 setup – my suspicion is it’s an Apple problem) that will not let you install from a USB stick. The error was totally non-obvious – it said that the hard disk was partitioned as MBR but needed to be GPT. To fix this, I ended up installing from the freshly burned DVD. The problem seems isolated to Mac models with an optical drive, as I did not have this issue on my Retina MacBook Pro or MacBook Air, neither of which has one.
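
As referenced above, if you want to rebuild a stick with a GPT partition table from the Windows side, the built-in diskpart utility can do it. A minimal sketch – the disk number is hypothetical, so double-check with “list disk” first, because “clean” erases whichever disk is selected:

    diskpart
    rem identify the USB stick by its size before selecting it
    list disk
    rem disk 2 is hypothetical - select YOUR stick, not a hard drive
    select disk 2
    clean
    convert gpt
    create partition primary
    format fs=fat32 quick
    assign
    exit

Then copy the contents of the Windows ISO onto the stick.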

Conclusions

What I was able to conclude from this exercise is that if your Mac has an optical drive, chances are you won’t be able to install from a USB stick. So the correct sequence would have been:

  1. Burn the Windows installation image to a rewriteable optical disc using whatever means you prefer.
  2. Download the Boot Camp 5 Windows support software from Apple’s site.
  3. Copy its folder structure to the just-burned optical disc.
  4. Run Boot Camp Assistant and install Windows.
  5. Install the supplementary Boot Camp software afterwards.

If on a Mac without an optical drive, simply use Boot Camp Assistant to create a USB stick with everything on it, install Windows from it, and then install the supplementary software.

Lastly, after finally getting everything working, I had to go into Power Options / Advanced power settings and disable adaptive brightness in order to get the screen brightness to function properly. Once all that was done, Windows 8.1 on a Mac has been fantastic.

Hope this helps!