SQL Saturday Exeter 2015 Notes

SQL Saturday Exeter was great fun and a good learning experience. If you are a data professional working with SQL Server, you should get yourself along to the next one in 2016 and get involved!

This is not polished enough to be considered a full review of SQL Saturday Exeter. It’s just my mental notes committed to page before I forget. I might tidy this up later if I get time :-)

DLM Training Workshop – Automated Database Deployment

Friday was a full day’s workshop on using Red Gate’s new solution for Database Lifecycle Management. Part 3 of a trilogy (part 1 is source control, part 2 is continuous integration).

This session was the compelling business reason to come to Exeter. At work we are planning the next phase of our automatic database deployment strategy. We already trust Red Gate’s database development tools, and we want to build on that investment.

It’s early days, but to me it looks like SQL Release is a winning product, and the missing piece of our database deployment puzzle.

I hope to write more about this soon :-)

Pirate Party

Yarr! Plastic dreads, gutrot rum, comical accents, gold (chocolate) coins, and FOOT JENGA!

Phil Factor – Spinach and Database Development (Keynote)

“What has spinach got to do with database development?”

As it turns out, they have data quality issues in common. For generations, people believed the bad data about the iron content of spinach.

Your most important job as a database professional is to defend against bad data. If you don’t do this, anything else you do will just be helping to deliver the wrong answer more quickly.

The keynote was delivered as a video and published online, so you can watch it again from the comfort of your own bed! At just six minutes long, you could even watch it over a coffee break.

Richard Douglas – Top Down Tuning

What can you do to tune SQL Server when you don’t have control over the database code? Actually quite a few things.

Richard reminded us that the server’s main job is to respond to requests for data. That data can be retrieved from three physical places: CPU cache, memory, and disk. Each subsequent location is roughly an order of magnitude slower to access than the one before. (Read Jeff Atwood’s post on relative latencies for a great general overview of the fundamental problem.)
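As a rough illustration (ballpark figures from the widely circulated “latency numbers every programmer should know” lists, not numbers from the session), the gaps look like this:

```shell
# Approximate relative access latencies (nanoseconds); these are
# ballpark public figures, not exact measurements
awk 'BEGIN {
  l1   = 1         # L1 cache reference, ~1 ns
  ram  = 100       # main memory reference, ~100 ns
  disk = 10000000  # spinning disk seek, ~10 ms
  printf "RAM is ~%dx slower than L1 cache\n", ram / l1
  printf "Disk is ~%dx slower than RAM\n", disk / ram
}'
```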

We had an overview of the server-side lifecycle of a database query, from initial receipt of the SQL query to the final submission of a TDS (tabular data stream) response.

A key stage of this lifecycle is when SQL Server looks in the plan cache.

Richard showed us how to reduce cache bloat with a server configuration option for storing a “stub” plan the first time around.

He pointed out that the query text must exactly match the text stored in the cache for the cached plan to be reused. Coding standards might seem onerous, but in this case they may relieve memory pressure!

You can monitor the plan cache with an odd but effective query from Jonathan Kehayias. Odd because it uses the WITH XMLNAMESPACES clause (new to me!) and XQuery; effective because you can pull missing index information from each XML plan into a single database table.

A related trick is that we can append OPTION (RECOMPILE) to the end of our diagnostic queries to stop their plans being stored in the plan cache. This defends against bloating the cache and pushing out more important plans.

SQL Server has a threshold at which any query may be run in parallel. In theory parallelism can increase performance, but in practice the server’s default threshold is “not fit for purpose” for today’s workloads.

Before you change it, review the costs of the cached plans to find the queries whose behaviour may change.

Alex Yates – First Steps in Continuous Integration for Databases with Red Gate tools

A one-hour version of parts one and two of the DLM sessions. Alex discussed some of the reasons that database lifecycle management is so hard, and demoed some Red Gate tools that remove some of the major pain points.

Some of the issues in DLM are inherent to the fact that databases contain state that you must keep. You can’t upgrade a database by throwing the old one away and replacing it with the new version.

SQL is a strange language (the only one?) in that the person doing the deployment often knows more about the language than the person who wrote the code!

Some of the issues are caused by working culture and poor communication. It boils down to “if it’s not in version control, it doesn’t exist.”

The audience was already comfortable with version control, so we went straight ahead with a demo of versioning, packaging and publishing a database project.

Demo setup: Windows, SVN, TeamCity, SQL Source Control, SQL Test, SQLCI plugin for TeamCity

  • Use SQL Source Control to script out all the database objects into a new folder in SVN
  • Add a new VCS root in TeamCity for the new folder
  • Add a new build step to the existing build configuration that connects to the VCS root, compiles the database code, and packages the database
  • Add a second build step that runs the unit tests in the package

Chris Testa-O’Neill – Cloud for the data professional

If you have been in IT for long enough you will recognise a pattern. The shift from mainframe to client-server was a lot like today’s shift from on-prem to cloud.

Today the reality for many organizations is a hybrid model. Some data and applications must stay on-prem for various reasons, but it makes sense for others to go into the cloud simply because it is more cost-effective.

Machine Learning is one of the services offered on Azure (Andrew Fryer’s Machine Learning session was on at the same time). To get the most from machine learning, you need a solid understanding of statistics. There are many good books now that approach statistics not from a dry mathematical perspective but from a practical, data-professional one. Chris recommended “Statistics for Dummies” as a good starting point.

Chris demoed how to provision SQL Server in the cloud. Once you have an Azure account, it’s as easy as logging in, filling in a few forms with your system requirements and credentials, and waiting a few minutes for the machine to be provisioned. It looks awesome, and I can’t wait to get the chance to play with this.

Someone asked about the DBA’s role in the future. The blunt answer is “Cloud is no excuse for bad code. With all the time we save on provisioning, now we can focus on adding value through performance tuning”.

William Durkin – Stories from the trenches: upgrading SQL with minimal downtime

I’m halfway through a domain migration project at work, so I came to this session to pick up any tips about upgrades that might also apply in this situation.

The session was an overview of the high-availability and disaster recovery features of SQL Server and how you can use them to implement an upgrade strategy. Backup/restore, log shipping, clustering, replication, and mirroring all have their place.

Unfortunately the demo machine was down (!), but we had some interesting discussion nonetheless.

The best part of this was how to define availability. Yeah, people like to talk about “five-nines availability”, but in what context? Not everyone is a 52 x 24 x 7 business (about five minutes of downtime allowed per year).

If you are more like a 48 x 12 x 5 business, you can measure your five nines within this window, which sounds tighter (seconds of downtime allowed), except that you now have all that out of hours time to do any maintenance work!
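The arithmetic behind those windows (my own back-of-envelope sketch, not a figure from the session) works out like this:

```shell
# Allowed downtime at five nines (99.999%) for two business windows;
# back-of-envelope numbers assuming a 365.25-day year
awk 'BEGIN {
  year   = 365.25 * 24 * 60   # minutes in a 52 x 24 x 7 year
  window = 48 * 5 * 12 * 60   # minutes in a 48-week, 12-hour, 5-day year
  printf "52x24x7 five nines: %.2f minutes of downtime a year\n", year   * (1 - 0.99999)
  printf "48x12x5 five nines: %.2f minutes of downtime a year\n", window * (1 - 0.99999)
}'
```

So “five nines” within the smaller window really does mean seconds, not minutes, of unplanned downtime.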

And of course, planned, discussed downtime that happens in business hours does not count towards the SLA. The SLA measures unplanned outage time.

Neil Hambly – Load Testing with SQL Server Tools

To complete the domain migration, we’ll need to do load testing to do a before-and-after comparison of performance.

By the end of the day my brain was getting pretty fried from all the new stuff (and sleep loss!) but I managed to hang on long enough to get some good stuff from this one too.

Neil demoed how the Distributed Replay features in SQL Server 2012 allow you to replay captured trace files so that you can recreate the conditions for a particular performance testing scenario.

There is quite a lot of setup and preprocessing needed to make distributed replay work with the trace files, and he demoed that as well. There are command-line tools for the distributed replay client that you can use to do this.

Post-Session Chat and Chillout

Prize draws, curry, beer, bed!

Also had an interesting chat with people from Micron, an SSD manufacturer. It was their first SQL Saturday. I hope they felt welcome!

We were wondering whether there is still any use case for spinning magnetic disks over SSDs. All we could think of was cold storage: if you just want to keep the data and don’t much care about access times, magnetic disks are surely still cheaper per gigabyte.

Maybe we should give up on magnetic altogether and just store slow data on blu-ray like Facebook started doing!

Convert JSON to CSV with jq and RecordStream

JSON is all the rage these days. XML and CSV are deeply uncool.

Jazzy or Jobsworth?

There are good reasons for JSON’s popularity as an interchange format, but right now you don’t care. You just want the data in a form understood by your tried-n-trusted tools.

Some trendy API is spraying out jazzy JSON at you, but you need something flat and tabular so you can populate your boring, business-critical spreadsheets and relational databases.

Sometimes you just want good old fashioned CSV. How are you gonna get it?

Say you want to import the list of metropolitan areas that Last.fm produces charts for. (I’m actually doing this for a project.)

Fetching the data from the Last.fm API is easy enough. Just make a web request with cURL and save the response to a file.

curl "http://ws.audioscrobbler.com/2.0/?method=geo.getmetros&api_key=7570768adbf999b04fadb54aa0548a96&format=json" \
> lastfm_metros.json

What did you get?

cat lastfm_metros.json

Yuck! How can you even read this, never mind work with it in a spreadsheet!?

{"metros":{"metro":[{"name":"Melbourne","country":"Australia"},{"name":"Adelaide","country":"Australia"},{"name":"Sydney","country":"Australia"},{"name":"Linz","country":"Austria"},{"name":"Graz","country":"Austria"},{"name":"Vienna","country":"Austria"},{"name":"Salzburg","country":"Austria"},{"name":"Innsbruck","country":"Austria"},{"name":"Minsk","country":"Belarus"},{"name":"Li\u00e8ge","country":"Belgium"},...

At this point you might be tempted to throw Perl or Python at the problem.

If you’re like me, you don’t like to write code when there’s already a tool out there that solves the problem.

Are there any tools out there that solve this problem? Yes!

The first is jq, a command-line JSON processor. It’s great for exploring and reshaping JSON.

If you use Ubuntu, you can install it as an apt package.

sudo apt-get install jq

The second tool is recs-tocsv, part of the RecordStream suite. It simply converts a certain shape of JSON to CSV.

RecordStream is written in Perl, so you can install it as a Perl module.

sudo cpanm -i App::RecordStream

With these tools you can solve the problem in three steps.

Step 1: Use jq to reformat the messy response so you can actually read it and figure out its structure.

Step 2: Use jq to transform the complicated response into a simpler intermediate form.

Step 3: Use recs-tocsv to turn the intermediate form into CSV, optionally specifying a column order.

Let’s get started with jq to complete step 1.

The simplest useful thing that jq does is pretty-printing. Just pipe some JSON into it and use the dot filter.

cat lastfm_metros.json |
jq .

The dot filter is jq’s identity transformation. Logically the output is identical to the input, but jq writes it out neatly with whitespace and colors so you can actually read it.

{
  "metros": {
    "metro": [
      {
        "country": "Australia",
        "name": "Melbourne"
      },
      {
        "country": "Australia",
        "name": "Adelaide"
      },
      {
        "country": "Australia",
        "name": "Sydney"
      },
...

Is it clearer now? The response is basically a list of country-metro pairs, but each pair is wrapped in an object, in a list, in an object, in an object.

The final output, then, should be a CSV file with two columns.

Let’s help recs-tocsv to help us by creating the form that it works best with. This is step 2.

recs-tocsv prefers a stream of flat linear JSON objects as input.

Stream means the output should be one JSON object per record; not one big object. In this example, one metro is one record.

Flat means the keys of each object should refer only to scalar values; no lists or other objects. The metro objects are already flat.

Linear means each object should be output on one line of text; line breaks shall delimit records. The original response happens to be linear.

You can create a record stream with a sequence of jq filters. Just as bash lets you pipe cat into jq, jq lets you pipe one filter into the next.

cat lastfm_metros.json |
jq ".metros | .metro | .[]"

The filter .metros outputs the value of the metros key, which is another object. .metro outputs the value of the metro key, which is a list. .[] outputs all the list elements as separate values.

With this sequence of filters, jq produces a flat stream.

{
  "country": "Australia",
  "name": "Melbourne"
}
{
  "country": "Australia",
  "name": "Adelaide"
}
{
  "country": "Australia",
  "name": "Sydney"
}
...
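As an aside, the three chained filters can be collapsed into a single path expression. A quick sketch with a tiny inline sample (assumes jq is installed):

```shell
# .metros.metro[] navigates both keys and iterates the array in one go
printf '%s' '{"metros":{"metro":[{"name":"Melbourne","country":"Australia"}]}}' |
jq '.metros.metro[]'
```

Both spellings produce the same stream; the chained version just makes each step easier to explain.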

To make the stream linear, switch on “compact” output.

cat lastfm_metros.json |
jq ".metros | .metro | .[]" --compact-output

Compact mode removes all the unnecessary whitespace, except for newlines at the end of records.

{"country":"Australia","name":"Melbourne"}
{"country":"Australia","name":"Adelaide"}
{"country":"Australia","name":"Sydney"}
{"country":"Austria","name":"Linz"}
{"country":"Austria","name":"Graz"}
...

Now that you have the correct intermediate form, step 3 is gonna be simple.

Step 3: pipe the output into recs-tocsv, and save to a file.

cat lastfm_metros.json |
jq ".metros | .metro | .[]" --compact-output |
recs-tocsv

Ta-da! CSV at last.

country,name
Australia,Melbourne
Australia,Adelaide
Australia,Sydney
Austria,Linz
Austria,Graz
Austria,Vienna
Austria,Salzburg
Austria,Innsbruck
Belarus,Minsk
Belgium,"Liège"
...

If the order of columns doesn’t matter to you, then you’re done!

Bulk insert operations into a relational database often expect a certain column order in the flat file source.

If you need to swap the two columns around, use the key parameter.

cat lastfm_metros.json |
jq ".metros | .metro | .[]" --compact-output |
recs-tocsv --key name,country

The key parameter takes a comma-separated list of key names in the JSON source, and basically looks just like the header of the CSV file.

name,country
Melbourne,Australia
Adelaide,Australia
Sydney,Australia
Linz,Austria
Graz,Austria
Vienna,Austria
Salzburg,Austria
Innsbruck,Austria
Minsk,Belarus
"Liège",Belgium
...

All done!

Creating MSMQ queues with PowerShell

Today I was fixing a troubled installation of an inherited ETL framework.

Two required message queues were missing from the environment. In fact, MSMQ was not even installed on the server.

I installed MSMQ, but “Message Queuing” was still missing from the “Computer Management” interface. How was I gonna create the queues now?

You guessed it: PowerShell.

[Reflection.Assembly]::LoadWithPartialName('System.Messaging')
$msmq = [System.Messaging.MessageQueue]
$msmq::Create('.\private$\etl_notifications', $true)
$msmq::Create('.\private$\etl_transform_tasks', $true)

Line 1 loads the System.Messaging assembly, the .NET interface to MSMQ.

Line 2 makes the next lines easier to read. The MessageQueue class has static methods to manage queue lifecycle. Now we can refer to it as just $msmq.

Lines 3 and 4 call MessageQueue.Create to do the actual work of creating the queues etl_notifications and etl_transform_tasks.

The framework uses private queues, so the queue paths are prefixed with .\private$\. The $true means it’s a transactional queue.

The weird thing is that “Message Queuing” appeared in the “Computer Management” interface after I added the queues.

msmq_computer_management

Had that happened in the first place, I would never have figured out the PowerShell way! So, thanks, buggy GUI, I guess.

Even more thanks to Jainath V R for his step-by-step PowerShell guide!

How do you search a different Active Directory domain?

Sometimes you’ll see a service account in SQL Server that you can’t easily find in Active Directory.

Say you want to find the service account for processing Adverts.

$ Get-ADUser -Filter "Name -like '*Advert*'" | Select Name

No results. Damn!

This was frustrating until someone reminded me that the account probably lives outside the corp domain, which holds mostly human users like me. Your own domain is the default domain for the AD cmdlets.

So how do you search other domains?

Use Get-ADForest to list all the domains in your forest.

$ (Get-ADForest).Domains
api.cloudcorp.local
corp.cloudcorp.local
dev.cloudcorp.local
int.cloudcorp.local
prod.cloudcorp.local

Use the -Server parameter of Get-ADUser to override the default domain value. It’s oddly named, but it’s basically synonymous with Domain. (It actually refers to an instance of Active Directory Domain Services.)

If you want to search all the domains, just set up a pipeline.

Select UserPrincipalName at the very end to distinguish the different domains.

$ (Get-ADForest).Domains | % { Get-ADUser -Server $_ -Filter "Name -like '*advert*'" } | Select UserPrincipalName

UserPrincipalName
-----------------
svc_advert@dev.cloudcorp.local
svc_advert@int.cloudcorp.local
svc_advert@prod.cloudcorp.local
svc_advert@test.cloudcorp.local

Thanks to Steve Mahoney on the PowerShell.com forum for explaining this.

Windows 7 Desktop Bug Renames Every File

Sometimes when you create a folder on the desktop, Windows 7 does what you ask, but also bugs out, asks a stupid question, and then renames every file on your desktop.

stupid question

The bug is that the folder name gets applied to every item on the desktop. The silly part is that it freezes the desktop while it warns that you can’t rename the Recycle Bin.

The error text is:

An unexpected error is keeping you from renaming the folder. If you continue to receive this error, you can use the error code to search for help with this problem.

Error 0x80004001: Not implemented

everything got renamed

You can undo the bug at the expense of losing the name of the new folder, so it’s not disastrous. It’s just stupid.

undo rename

Filing this here because I don’t know where Microsoft accepts public bug reports for Windows 7.

When I did search for help on the error code, I found a Technet post describing the same issue. No reason or solution was proposed.

Xubuntu Remote Desktop

At Sand Port we made a media center out of my Xubuntu ThinkStation. Now we have an easy central place for listening to tunes and watching fireplaces.

We’re a pretty lazy bunch, often fiddling with laptops while something is on the TV. Wouldn’t it be great if we could control the media center without even lifting our hands from the keyboard?

I want to make it easy for others, so setting up an RDP server seemed like the best solution. Windows has a built-in RDP client so my flatmate wouldn’t have to install any software.

To make this work in Xubuntu I used xrdp and vino on the server, and on the testing client I used nmap, FreeRDP and Remmina.

Mapping the network

The first step is to find the media center from my laptop.

Use nmap -sn (ping scan) to find hosts on the local network.

$ nmap -sn 192.168.0.1-254

Starting Nmap 6.40 ( http://nmap.org ) at 2014-04-01 21:41 BST
Nmap scan report for 192.168.0.1
Host is up (0.020s latency).
Nmap scan report for 192.168.0.6
Host is up (0.043s latency).
Nmap scan report for 192.168.0.10
Host is up (0.000067s latency).
Nmap done: 254 IP addresses (3 hosts up) scanned in 3.38 seconds

http://linuxcommando.blogspot.co.uk/2008/05/ping-or-nmap-to-identify-machines-on.html

Three IPs: 1 is the router, and 6 and 10 are my media center and laptop. Which way round, though?

I ran ifconfig at the media center to find out its own IP address.

$ ifconfig
eth0      Link encap:Ethernet  HWaddr 00:21:86:fa:f0:45  
          inet addr:192.168.0.5  Bcast:192.168.0.255  Mask:255.255.255.0
[...]

The output tells me I can use 192.168.0.5 to refer to it on the local network.

Enable RDP on the server

Setting up the actual RDP server is as simple as installing a package.

sudo apt-get install xrdp

The default port for the RDP protocol is 3389.

Check just this port using nmap on the laptop.

$ nmap 192.168.0.5 -p 3389

Starting Nmap 6.40 ( http://nmap.org ) at 2014-04-01 22:25 BST
Nmap scan report for 192.168.0.5
Host is up (0.0039s latency).
PORT     STATE    SERVICE
3389/tcp filtered ms-wbt-server

Nmap done: 1 IP address (1 host up) scanned in 0.49 seconds

Previously I locked down the media center ports using the gufw firewall. I made an exception for all incoming connections on port 3389.

gufw rule

Now the port is open.

$ nmap 192.168.0.5 -p 3389

Starting Nmap 6.40 ( http://nmap.org ) at 2014-04-01 22:45 BST
Nmap scan report for 192.168.0.5
Host is up (0.0031s latency).
PORT     STATE SERVICE
3389/tcp open  ms-wbt-server

Nmap done: 1 IP address (1 host up) scanned in 0.49 seconds

Start a new RDP session

Install freerdp on the laptop. It’s a command line RDP client.

sudo apt-get install freerdp-x11

Use freerdp to connect to the media center on the default port.

xfreerdp 192.168.0.5

Got a login screen. So far so good.

freerdp login

Log in as sandport.

Login appears to be successful, but all I see is a blank screen. Rubbish.

blank remote desktop

You have to put the name of the desktop manager in a file called .xsession in the sandport home directory.

echo "xfce4-session" > .xsession

http://c-nergy.be/blog/?p=4448

Try again. Success!

remote desktop success

Some of the icons look wrong, but I can live with that.

The main issue is that this actually creates a new desktop session. What I really want to do is share control of the existing desktop so I can queue stuff up on Spotify.

Sharing the main desktop

Ubuntuwiki has a guide to desktop sharing with Xrdp that contains almost everything I needed.

http://askubuntu.com/questions/235905/use-xrdp-to-connect-to-desktop-session

http://ubuntuwiki.net/index.php/Xrdp,_installing

Back to the server to install vino, a desktop sharing server for VNC. This works because xrdp actually uses VNC on the server and talks to clients using RDP.

sudo apt-get install vino

Unfortunately I saw this error, because Vino doesn’t start automatically on XFCE.

“connecting to 127.0.0.1:5900 error – problem connecting”

To make it start in XFCE you have to add XFCE to the list of desktops in the autostart file.

The autostart file is here:

/etc/xdg/autostart/vino-server.desktop

You have to change the line with OnlyShowIn to look like this:

OnlyShowIn=GNOME;Unity;XFCE
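If you’d rather script the change, a sed one-liner can do it. The sketch below rehearses on a throwaway copy (a hypothetical file name) before you point it at the real /etc/xdg/autostart file:

```shell
# Rehearse the edit on a scratch copy first; the file name here is
# just a stand-in for the real vino autostart file
printf 'OnlyShowIn=GNOME;Unity;\n' > vino-autostart.test
sed -i 's/^OnlyShowIn=.*/OnlyShowIn=GNOME;Unity;XFCE/' vino-autostart.test
cat vino-autostart.test
```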

To check that it worked, restart XFCE and inspect the output of netstat -antp for an instance of vino-server listening on port 5900.

For convenience, rearrange the desktop options in /etc/xrdp/xrdp.ini so that the main desktop (console) is at the top. Make the username blank so that all you have to type is the password.

[xrdp1]
name=console
lib=libvnc.so
username=
password=ask
ip=127.0.0.1
port=5900

Use Remmina for everyday RDP use in Xubuntu. It’s like the best of the Windows built-in client and RdpMan. You can save connection settings and you get the floating menu when you are connected.

Remmina’s awesome feature is that it automatically scales the desktop to fit your screen. Useful if your main desktop is on a widescreen TV!

The Windows client actually supports this too, but it’s hidden. Right click on the top-left icon and choose “smart sizing” to fit the large screen into the smaller one.

Gotchas

Getting remote desktops (not shared) was enough of an achievement, so I played about with those for a while.

Eventually you’ll get this message if you keep failing to log out properly.

xrdp_mm_process_login_response: login failed

http://linuxtoolkit.blogspot.sg/2013/03/xrdpmmprocessloginresponse-login-failed.html

I followed the advice of Linux Toolkits to delete old X sessions and restart xrdp.

Still no joy.

Looked at /var/log/xrdp-sesman.log and saw that it still thought it had run out of displays.

Linux Toolkits encountered this too:

http://linuxtoolkit.blogspot.co.uk/2013/09/x-server-no-display-in-range-is.html

Instead of just upping MaxSessions to 100, I reset the X11DisplayOffset counter to 1 and restarted the server.

At some point something messed with the ownership of an .Xauthority file.

http://askubuntu.com/questions/350692/xubuntu-13-04-x-login-loop

I saw messages like this when I was trying to run “gksudo mousepad”:

“Failed to run /usr/sbin/synaptic as user root.
Unable to copy the user’s Xauthorization file.”

http://ubuntuforums.org/showthread.php?t=898233

I thought it was just Xubuntu being weird about something.

But when I restarted the media center it prompted me for a password, even though I asked it not to.

And when I gave it the password it just asked me again, and again, and again.

CTRL + ALT + F2 got me to the emergency shell and I could log in there.

sudo chown sandport:sandport .Xauthority

So I learned a lot of useful stuff, and gained some appreciation for how well tested Windows is!

How to set up Xubuntu

Install Whisker Menu

The default application menu is fine with the mouse, but hard to use with the keyboard.

The Windows 7 start menu works well with both. On the keyboard you just press WIN, type to filter programs, files, settings, whatever, and select the one you want. Easy!

You can get close to that level of simplicity by installing Whisker Menu.

Whisker Menu will probably ship with Xubuntu 14, but until then you have to install it yourself.

Use these commands to install Whisker Menu.

sudo add-apt-repository ppa:gottcode/gcppa
sudo apt-get update
sudo apt-get install xfce4-whiskermenu-plugin

Follow these steps to replace the old Application Menu.

  • Right click on the top menu and choose Panel, Add New Items.
  • Find Whisker Menu in the list of items.
  • Drag and drop Whisker Menu next to the application menu.
  • Right click on the old Application Menu and choose Remove. Click Remove again in the dialog to confirm.

Now create a keyboard shortcut so you can access the menu from the keyboard. You can use Whisker Menu to bootstrap this!

  • Click the Whisker Menu.
  • Type ‘set’ and press Enter to open Settings.
  • Go to Keyboard, Application Shortcuts, Add.
  • Put xfce4-popup-whiskermenu as the command.
  • Set WIN + space as the shortcut.

You could set just WIN as the shortcut, but then the Whisker Menu would pop up whenever you use any other shortcuts that combine WIN and another key.

Xfce is a bit too simple here to emulate Windows exactly.

MRU tab switching in Firefox

You can flick through tabs in Firefox just like you flick through applications on the desktop. The shortcut is CTRL + TAB.

By default, Firefox cycles through the tabs from left to right. I generally only care about the current tab and the MRU (most recently used) tab.

Browse to about:config and acknowledge the warning that here dragons be.

Set browser.ctrlTab.previews to true.

Easy Window Tiling

Enable Windows-style tiling by assigning keyboard and mouse shortcuts to the tiling functions.

Go to Applications, Settings Manager, Window Manager.

In the Advanced tab, uncheck ‘Wrap workspaces when dragging a window off the screen’.

Go to the keyboard tab.

Double click on ‘Tile window to the left’ and press SUPER + left.

Double click on ‘Tile window to the right’ and press SUPER + right.

In Linux, WIN is called SUPER.

Start menu

whisker menu

Version /etc

Use etckeeper to keep an audit trail of /etc (system configuration) changes.

First install Mercurial, which will be the backing version control system.

sudo apt-get install mercurial

Install etckeeper.

sudo apt-get install etckeeper

Edit the etckeeper config.

sudoedit /etc/etckeeper/etckeeper.conf

Set the VCS to mercurial (hg).

# The VCS to use.
VCS="hg"
#VCS="git"
#VCS="bzr"
#VCS="darcs"

Make the initial commit.

sudo etckeeper init
sudo etckeeper commit "Initial commit."

etckeeper automatically commits after apt activity and on a daily cycle.

Set multi monitor layout

Use arandr to get a tool very similar to Windows 7’s built-in multi monitor layout editor.

Install arandr.

sudo apt-get install arandr

Arrange the layout visually, just like on Windows 7.

arandr-ell-layout

Thanks to Jeremy L Gaddis and Alin Andrei for the tips.