Monday, 28 January 2013

Windows 7 Virtual WiFi Hotspot, Changing DHCP Pool

First, another whinge in my "Linux vs Users" series. This is one of those functions of an OS you'd think should have been there forever: a PC with a wireless adapter should be easily configurable as a Wireless Access Point, right? And Linux seems to have all the right tools and be perfectly geared to do this, right? Well, in theory yes, but in real life getting the software to perform this task compatibly with your wireless adapter is pretty complicated. So if the hard-working Linux hackers find this hard to achieve, what can we expect from Microsoft, who generally works with its back to the user's needs, right?
 
Well wrong.
 
With Windows 7 you get something called a Virtual Wireless Adapter and what they call a "hosted network" (hostednetwork in netsh parlance), which does basically what its name says: it creates a virtual WiFi access point, secured with WPA2-PSK, and assigns DHCP addresses to the clients that connect to it. There are quite a few tutorials googleable on the Internet to guide you through the process (http://www.wi-fiplanet.com/tutorials/article.php/3849841/How-to-Create-Wireless-Hosted-Networks-in-Windows-7.htm, http://winsupersite.com/article/faqtip/windows-7-tip-of-the-week-use-wireless-hosted-networking-to-share-an-internet-connection-wirelessly). This works pretty straightforwardly, but being Microsoft, not everything could be that easy, could it?
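For reference, the whole setup boils down to two commands in an elevated command prompt; this is a minimal sketch, and the SSID and key below are placeholder values to substitute with your own:

  netsh wlan set hostednetwork mode=allow ssid=MyHotspot key=MyPassphrase
  netsh wlan start hostednetwork

The first command defines the network name and WPA2 passphrase, the second brings the virtual adapter up; Internet access is then shared to it through the Sharing tab in the properties of your real, Internet-facing connection.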
 
Two issues arise. First, this is a manual process: you have to run a command line, and there is no automation or recovery after a restart or a suspend-resume cycle. This can be partially overcome by running the command netsh wlan start hostednetwork as a script on startup (see http://www.addictivetips.com/windows-tips/how-to-run-programs-automatically-on-windows-7-system-startup/, and the sketch below), although auto-reconnecting or starting up after a resume would require more complex solutions to be programmed.
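As a minimal sketch of that startup script (the file name is made up; also note that netsh wlan start hostednetwork needs administrator rights, so running it from a Task Scheduler task set to "Run with highest privileges" avoids the UAC prompt):

  @echo off
  rem start-hotspot.bat: bring the hosted network back up after a reboot
  netsh wlan start hostednetwork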
 
The second issue is that the DHCP IP address pool assigned to the clients is seemingly fixed, in the 192.168.137.x range. This might not be an issue for most users, but if the hotspot is integrated into a larger LAN where these IPs might conflict with others, or if, like me, you simply don't like the number 137, there should be a way to change it.
 
Well yes, it is actually possible, but in true MS style it isn't easily accessible. As explained on various sites (for example http://support.microsoft.com/kb/230148), you must modify a set of values under the following registry key:
  •  HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\SharedAccess\Parameters
The values to change are ScopeAddress, ScopeAddressBackup & StandaloneDhcpAddress, which should all be set to the first IP address of the range you want to assign (e.g. 192.168.12.1 would give you addresses in the 192.168.12.x/24 range). The netmask is fixed at 255.255.255.0 and is not configurable. Once changed, restart the Internet Connection Sharing service (you can find it by typing Services in the Windows Start Menu search bar).
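If you prefer the command line to regedit, the same change can be scripted along these lines (using the 192.168.12.1 example from above, and assuming the three values are REG_SZ strings, as they are on a default install):

  reg add "HKLM\SYSTEM\CurrentControlSet\services\SharedAccess\Parameters" /v ScopeAddress /t REG_SZ /d 192.168.12.1 /f
  reg add "HKLM\SYSTEM\CurrentControlSet\services\SharedAccess\Parameters" /v ScopeAddressBackup /t REG_SZ /d 192.168.12.1 /f
  reg add "HKLM\SYSTEM\CurrentControlSet\services\SharedAccess\Parameters" /v StandaloneDhcpAddress /t REG_SZ /d 192.168.12.1 /f
  rem restart Internet Connection Sharing (its service name is SharedAccess)
  net stop SharedAccess
  net start SharedAccess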

 
One additional consideration which, although pretty obvious once encountered, does not seem to be addressed by any of the sites I have found: if you change the DHCP server address in the registry, you must also modify the static IP address assigned to the Virtual WiFi Adapter (type view network connections in the Windows Start Menu search bar, then right click and choose Properties on the adapter identified as Microsoft Virtual WiFi Miniport Adapter). Here you must set the address to the same value used in the registry (192.168.12.1 in the previous example).
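This step can also be done from an elevated command prompt instead of the GUI; here "Wireless Network Connection 2" is just an assumed name for the virtual adapter, so check what it is called on your system in view network connections:

  netsh interface ip set address name="Wireless Network Connection 2" static 192.168.12.1 255.255.255.0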
 
 

Saturday, 26 January 2013

Linux vs Users

So I’ve been using Linux for 16+ years now, since the days when SuSE Linux and Red Hat distributions enticed me into playing around with a free, in all senses, operating system. Things have come a long way since then, but it seems the main players, I’ll call them Canonical, Gnome, Red Hat and friends, are not learning from their history and insist on mistakes that are turning me away from the whole Linux scene.
 
Back in the day two equally important motivations were central to my adventure towards the free Unix-like systems: the “hacker” aura that surrounded Linux, which allowed you to get your hands well and truly dirty in the functioning of a pc and its programs, and the fact that Microsoft’s operating systems of the time were frustratingly uncooperative and unreliable (Windows 95, Windows 98, Windows Me to name a few). In the early stages of Linux you practically had to become a hacker (ok, that’s exaggerating) to get a full working system up and running, what with understanding how to compile and install the right kernel, setting up the boot loader properly, manually editing X configuration files for the graphical interface to show up, wrestling with the correct modules and config files to get your network set up (not to mention the fun with dial-up modems), enabling your printers and input devices... All that before you started to install programs that actually did anything, and started all the fun with library dependencies and version number incompatibilities, different file and folder locations between distributions or even versions of the same distribution, and thousands of other issues that made finally having a system with the setup you needed quite a learning process and something you felt proud about. And you got to learn a lot about what makes an OS tick, programming in multiple languages/ scripts and generally the hacky way of getting around and solving problems in the IT world.
 
Meanwhile Windows was BAD. Not only was it created by the company that defined evil, it was not good, it crashed your pc, spat out blue screens of death, made your computer feel like a steam engine with no steam, and generally made changing, configuring or debugging anything a trying task… Oh and it cost money, which is always a bad thing (free is always better than anything else). That everyone in the whole wide world used Windows only made us Linux (or other Unix flavours) people that much wiser. And cooler. And bester.
 
Then things evolved. Linux started to change, Gnome and KDE each unified, in their own way, the graphical desktop experience, things started to auto-configure and auto-detect themselves, and it became much easier to achieve the cool things you could already do before, but with much less effort and knowledge of system internals. New distributions started to include assistants and GUI programs that actually worked (sometimes, most of the time) and made getting your setup ready less of a nightmare. Everybody could communicate and share over the Internet, which greatly helped the “business” model of Linux: if you have a problem or there is something you don’t know how to do, you can google (bing?) it, and someone else smarter than you has already had that problem and solved it. People who need software solutions can tell the guys designing and programming them what they want and where what they’re doing falls short, so everything is more streamlined and opportunities for innovation are easier to detect. There are many more Linux users, although still proportionately far fewer than those using (bad) Microsoft’s OS; the community is strong and some of the best ideas come from these people. Even Apple has based their OS on their own Unix flavour, and although you also have to pay for Apple products, they are GOOD (and they make things that look nice also!).
 
At the same time, someone at Microsoft must have thought, “hey, why don’t we actually make something good, since we’ve got all this money and smart people and all that?” and they actually did. Windows XP was, well, not bad, and Windows 7 is probably the best OS Microsoft has built: stable, usable (sort of) and, most important of all, THINGS WORK. I’ll clarify: if your system is not old (by Microsoft’s standards) and has Windows 7 compatible components (that is a bit obvious, but I’ll explain later), you plug the OS into your pc and it will rumble and roll and get your stuff working. Fiddle around and configure and set things up properly and it will even work decently. Your programs, devices, games, websites, social things, everything will go. Windows 8 (although in early stages) seems to be much the same, keeping in mind all the fun with the Metro (that isn’t its proper name, is it?) interface and how most people seem to hate it.
 
Meanwhile, the big Linux user distributions, Ubuntu (and its variants), Red Hat/ Fedora, openSUSE, Debian, Mandrake, where are they going? It seems in the opposite direction! I’ll make my case with Canonical’s Ubuntu, but the same could apply to Fedora or Gnome (see what Linus Torvalds, the guy Linux is named after, among others, thinks about Gnome 3). I have installed and set up Linux on thousands (if not millions! Maybe a few less) of pcs in both domestic and server environments, and the first thing was always there: it works. Maybe not all of it, maybe not every component is running at full tilt of its capabilities, or some programs need a bit of tweaking to get them working, but your basic system is a pretty much straightforward installation as long as it does not date from the Cold War era.
 
But as of late, this just does not happen. On pretty recent systems (< 2 years old) with relatively straightforward setups, you install the OS and it won’t recognize your wireless adapter (or it will during the setup and won’t once the system is installed??!?!?), or it crashes your system every time you suspend/ hibernate/ power up, to name two of my most recurrent and annoying failures. Sure, most of the time if you spend some time on the Internet looking things up you can find a solution to your problems (or a workaround), but that isn’t what Canonical (in this case) is selling: Ubuntu is for humans, right? As in not for geeks/ hackers! They do things like the Unity interface basically against the will of the community (I particularly only like the quick search functions but, hang on, didn’t Windows 7 already have something similar in the start menu?), pushing a 3D interface and modern features but forgetting, in my opinion, one of the main good things Linux always had: old stuff worked too, better, faster and more reliably than on other OSes. A lot of good work has gone into the repository and software distribution system, making it easier to stay up to date with current versions of your software, avoid dependency issues, and let software developers make their creations easily available, although the only lesson Canonical seems to have taken from how Google and Apple handle their mobile app distribution is the need for a showcase or shop to give programs visibility.
 
Returning to the “hardware compatibility” issue, Linux has everything stacked against it, as hardware and system developers in the proprietary and commercial world neither release drivers/ APIs for Linux systems nor provide sufficient information for the community of developers to create the necessary software, and I understand that it is through great creativity and effort that any compatibility is achieved at all, with pretty admirable results. I think for any software developer in the world it is much “cooler” to create new and innovative things and design new interfaces and ways of attacking problems than to debug and ensure reliability and backward compatibility. And I am not saying that there isn’t considerable effort in this direction, especially considering how little people are generally paid for doing it! This is where organisations like Canonical, with sufficient resources, should be putting considerably more effort: making Linux a rock solid system and focusing less on dubiously creative ideas with little acceptance in their user base. I am far from being a “newbie” in the IT world and I personally have lost patience with all the issues that arise with just a basic system; I cannot imagine what a normal user thinks when he ventures into a “user friendly” Linux distribution like Ubuntu and gets hit flat in the face with so many issues. It’s a vicious circle: Linux doesn’t have more users because it’s hard to use, and it’s hard to use because more companies do not dedicate resources to making it work out of the box and easier to use, and those companies don’t dedicate more resources because the user base doesn’t justify it.
 
I like Linux, many of my favourite tools are Linux-only (or the Linux version is better than the Windows/ OSX version), I like the free/ open source software model, and I like being able to hack into my system and make it do cool things only I can do (after reading the expert advice of thousands of gurus on the Internet, of course!), but in real life my pc is a tool, a means to do things, not an objective in itself, and right now I can only get that properly with the bad guys. Does Linux want to remain a geek and hacker’s toy, or become a proper user-friendly domestic system?