What to do if OneDrive for Business keeps restarting

After a recent company-wide update to our work computers, OneDrive for Business started behaving strangely.

It just wouldn’t leave me alone! Every time I started my computer, it would repeatedly pop up an Explorer window showing my local copy of my files in OneDrive for Business. All the files were showing a sync error.

The OneDrive for Business folder would repeatedly pop up whenever I started my computer. All the items show a sync failure.

Hot cross buns of death

Not only did the sync break, but the system tray broke as well. Hundreds of copies of the OneDrive for Business icon and the Office Sync Center icon were being left behind in the system tray.

The system tray contains 95 icons: 56 of them are OneDrive for Business, and 30 are Office Sync Center.

The tray is eating my screen

And I literally mean hundreds, if you leave it long enough.

There is not enough room to show all of the items in your system tray. Please uninstall some programs, or try a higher screen resolution.

That many arrows reminds me of Kye http://games.moria.org.uk/kye/

The icons in the tray would disappear when I moused over them. It looks like that was a fatal combination of a system tray bug and a OneDrive for Business bug.

The tray problem was more annoying than the sync problem because it hid all the real tray icons from me. If I didn’t “farm” the dead icons every twenty minutes or so, the real icons were effectively invisible.

In the Task Manager I saw that there were several instances of MSOSYNC.EXE (Office Sync Center) and GROOVE.EXE (OneDrive for Business). The instances seemed to be short-lived, because when I tried to kill them through the GUI, it would sometimes say that they were already gone.

While I waited for help from Corporate IT Services, I hacked this PowerShell script together, and left it running in a minimized shell to keep the tray in a usable state.

while ($true) {
  gps msosync, groove -ea silentlycontinue | spps
  sleep -milli 500
}

Every 500 milliseconds, it stops any instances of OneDrive for Business and the Office Sync Center.

I observed that new instances were starting about every second or so. Killing the instances more frequently than that ensured that there was no build up of dead icons in the tray.

Of course, this was only hiding the tray problem, and didn’t address the sync problem.

A colleague scoured the Office 365 forums and eventually found the answer. You may have guessed already, but it was an issue with the cache.

Deleting the cache and resyncing all the files fixed the issue.

These instructions are due to Maggie Li on the Office 365 community forum.

1. Back up all the local documents & files in your document libraries to another place.

2. After the backup work is done, remove the original documents & files in your local OneDrive for Business folder.

3. Clear the cache for OneDrive for Business by deleting the OneDrive for Business Spw and 15.0 folders. To do this, follow these steps:

1). Open Task Manager (you can open Task Manager by pressing Ctrl+Shift+Esc). Make sure none of the following processes are running. If any are, end them one by one.


2). Open an elevated command prompt as follows:

Click the Start button and type cmd in the search bar. When cmd.exe appears in the results, right-click it and select Run as Administrator.

3). At the command prompt, delete the Office file cache and Spw and 15.0 folders by issuing four commands as follows:

Type “cd %USERPROFILE%\AppData\Local\Microsoft\Office\15.0\” and then press the Enter key.
Type “rmdir OfficeFileCache /s” and then press the Enter key.
Type “cd %USERPROFILE%\AppData\Local\Microsoft\Office\” and then press the Enter key.
Type “rmdir Spw /s” and then press the Enter key.

4) If you get an error when executing either rmdir command, one of the .exe processes is probably still running. Correct the problem by returning to the Task Manager (step 3-1), stopping the processes, and then removing the directories as described previously.

4. Sync the library again.

The instructions worked, and my files are in sync again!

For reference, here’s exactly what worked for me.

I followed steps 1 and 2 exactly as Maggie described.

Before starting step 3, I restarted Windows in safe mode. I had to do this because csisyncclient.exe, msosync.exe, and groove.exe would keep restarting even if I stopped them. In safe mode they do not start automatically.

One file in the OfficeFileCache folder caught my eye. It was an Access database called CentralTable.accdb, and it had swelled to over 2 GB. As I write this, now that the problem is fixed, it is around 200 MB in size.

After completing step 3, I restarted Windows in normal mode.

When I logged in, the Office Upload Center appeared with an error message.

OneDrive for Business found a problem while accessing the Microsoft Office Document Cache and needs to repair it before it can continue.

As part of the repair a copy of the cache will be saved as a backup and a new cache will be created.

When I clicked Repair, I got another error message.

The action cannot be completed because another application is using the Microsoft Office Document Cache.

Please close all Microsoft Office applications or restart your computer and try again.

I clicked Try Again, but the same message appeared.

I clicked Cancel.

I noticed OneNote was running in the tray, so I closed it to be sure.

I started the Upload Center again. It started normally with this message:

No files are pending upload

I checked in the Task Manager. There was only one instance each of CSISYNCCLIENT.EXE, GROOVE.EXE, and MSOSYNC.EXE. They were all running normally.

My OneDrive for Business folder was still empty, and the version on SharePoint was still populated.

On SharePoint in Internet Explorer I clicked “Sync” to start the sync process.

It took a while, but eventually all my files appeared locally and were synced with the server version.

The only problem is that OneDrive for Business has created a new folder to sync to.

The new folder is called

C:\Users\iain\OneDrive – Company Ltd 1

The old folder, which is still empty, is called

C:\Users\iain\OneDrive – Company Ltd

I think I can live with that for now.

Thanks for your help, Maggie!

SQL Saturday Exeter 2015 Notes

SQL Saturday Exeter was great fun and a good learning experience. If you are a data professional working with SQL Server, you should get yourself along to the next one in 2016 and get involved!

This is not polished enough to be considered a full review of SQL Saturday Exeter. It’s just my mental notes committed to page before I forget. I might tidy this up later if I get time :-)

DLM Training Workshop – Automated Database Deployment

Friday was a full day’s workshop on using Red Gate’s new solution for Database Lifecycle Management. Part 3 of a trilogy (part 1 is source control, part 2 is continuous integration).

This session was the compelling business reason to come to Exeter. At work we are planning the next phase of our automatic database deployment strategy. We already trust Red Gate’s database development tools, and we want to build on that investment.

It’s early days, but to me it looks like SQL Release is a winning product, and the missing piece of our database deployment puzzle.

I hope to write more about this soon :-)

Pirate Party

Yarr! Plastic dreads, gutrot rum, comical accents, gold (chocolate) coins, and FOOT JENGA!

Phil Factor – Spinach and Database Development (Keynote)

“What has spinach got to do with database development?”

As it turns out, they both have data quality issues in common. For generations, people believed the bad data about the iron content of spinach.

Your most important job as a database professional is to defend against bad data. If you don’t do this, anything else you do will just be helping to deliver the wrong answer more quickly.

The keynote was delivered as a video and published online, so you can watch it again from the comfort of your own bed! At just six minutes long, you could even watch it over a coffee break.

Richard Douglas – Top Down Tuning

What can you do to tune SQL Server when you don’t have control over the database code? Actually quite a few things.

Richard reminded us that the server’s main job is to respond to requests for data. That data can be retrieved from three physical places: CPU cache, memory, and disk. Each subsequent location is roughly an order of magnitude slower to access than the one before. (Read Jeff Atwood’s post on relative latencies for a great general overview of the fundamental problem.)

We had an overview of the server-side lifecycle of a database query, from initial receipt of the SQL query to the final submission of a TDS (tabular data stream) response.

A key stage of this lifecycle is when SQL Server looks in the plan cache.

Richard showed us how to reduce cache bloat with a server configuration option (“optimize for ad hoc workloads”) that stores a “stub” plan the first time a query runs.

He pointed out that the query text must match exactly to the text stored in the cache for the cached plan to be usable. Coding standards might seem onerous, but in this case they may relieve memory pressure!

You can monitor the plan cache with an odd but effective query from Jonathan Kehayias. Odd because it uses the WITH XMLNAMESPACES clause (new to me!) and XQuery, effective because you can pull out missing index information from each XML plan into a single database table.

A related trick is that we can append OPTION(RECOMPILE) to the end of our diagnostic queries to stop them being stored in the plan cache. This defends against bloating the cache and also pushing out more important plans.

SQL Server has a threshold at which any query may be run in parallel. In theory parallelism can increase performance, but in practice the server’s default threshold is “not fit for purpose” for today’s workloads.

Before you change it, you can review the costs of the cached plans to find the queries whose plans may change.

Alex Yates – First Steps in Continuous Integration for Databases with Red Gate tools

A one-hour version of parts one and two of the DLM sessions. Alex discussed some of the reasons that database lifecycle management is so hard, and demoed some Red Gate tools that remove some of the major pain points.

Some of the issues in DLM are inherent to the fact that databases contain state that you must keep. You can’t upgrade a database by throwing the old one away and replacing it with the new version.

SQL is a (the only?) strange language in that the person doing the deployment often knows more about the language than the person who wrote the code!

Some of the issues are caused by working culture and poor communication. It boils down to “if it’s not in version control, it doesn’t exist.”

The audience was already comfortable with version control, so we went straight ahead with a demo of versioning, packaging and publishing a database project.

Demo setup: Windows, SVN, TeamCity, SQL Source Control, SQL Test, SQLCI plugin for TeamCity

  • Use SQL Source Control to script out all the database objects into a new folder in SVN
  • Add a new VCS root in TeamCity for the new folder
  • Add a new build step to the existing build configuration that connects to the VCS root, compiles the database code, and packages the database
  • Add a second build step that runs the unit tests in the package

Chris Testa-O’Neill – Cloud for the data professional

If you have been in IT for long enough you will recognise a pattern. The shift from mainframe to client-server was a lot like today’s shift from on-prem to cloud.

The reality for many organizations today is a hybrid model. Some data and applications must stay on-prem for various reasons, but it makes sense for others to go into the cloud simply because it is more cost-effective.

Machine Learning is one of the services offered on Azure (Andrew Fryer’s Machine Learning session was on at the same time). To get the most from machine learning, you need to have a solid understanding of statistics. There are many good books now that approach statistics not from a dry mathematical perspective but from a practical, data professional perspective. Chris recommended “Statistics for Dummies” as a good starting point.

Chris demoed how to provision SQL Server in the cloud. Once you already have an Azure account, it’s as easy as logging in, filling in a few forms with your system requirements and credentials, and waiting a few minutes for the machine to be provisioned. It looks awesome, and I can’t wait to get the chance to play with this.

Someone asked about the DBA’s role in the future. The blunt answer is “Cloud is no excuse for bad code. With all the time we save on provisioning, now we can focus on adding value through performance tuning”.

William Durkin – Stories from the trenches: upgrading SQL with minimal downtime

I’m halfway through a domain migration project at work, so I came to this session to pick up any tips about upgrades that might also apply in this situation.

The session was an overview of the high-availability and disaster recovery features of SQL Server and how you can use them to implement an upgrade strategy. Backup/restore, log shipping, clustering, replication, and mirroring all have their place.

Unfortunately the demo machine was down (!), but we had some interesting discussion nonetheless.

The best part of this session was the discussion of how to define availability. Yeah, people like to talk about “five-nines availability”, but in what context? Not everyone is a 52 x 24 x 7 business (about five minutes of downtime allowed per year).

If you are more like a 48 x 12 x 5 business, you can measure your five nines within this window, which sounds tighter (seconds of downtime allowed), except that you now have all that out of hours time to do any maintenance work!

And of course, planned, discussed downtime that happens in business hours does not count towards the SLA. The SLA measures unplanned outage time.

Neil Hambly – Load Testing with SQL Server Tools

To complete the domain migration, we’ll need to do load testing to do a before-and-after comparison of performance.

By the end of the day my brain was getting pretty fried from all the new stuff (and sleep loss!) but I managed to hang on long enough to get some good stuff from this one too.

Neil demoed how the Distributed Replay features in SQL Server 2012 allow you to replay captured trace files so that you can recreate the conditions for a particular performance testing scenario.

There is quite a lot of setup and preprocessing to make distributed replay work with the trace files, and he demoed that as well. There are some command-line tools for the distributed replay client that you can use to do this.

Post-Session Chat and Chillout

Prize draws, curry, beer, bed!

Also had an interesting chat with people from Micron, an SSD manufacturer. It was their first SQL Saturday. I hope they felt welcome!

We were wondering whether there is ever a use case nowadays for spinning magnetic disks over SSDs. All we could think of was archival storage: if you just want to keep the data and don’t really care about access times, magnetic disks are surely cheaper per gigabyte.

Maybe we should give up on magnetic altogether and just store slow data on blu-ray like Facebook started doing!

Convert JSON to CSV with jq and RecordStream

JSON is all the rage these days. XML and CSV are deeply uncool.

Jazzy or Jobsworth?

There are good reasons for JSON’s popularity as an interchange format, but right now you don’t care. You just want the data in a form understood by your tried-n-trusted tools.

Some trendy API is spraying out jazzy JSON at you, but you need something flat and tabular so you can populate your boring, business-critical spreadsheets and relational databases.

Sometimes you just want good old fashioned CSV. How are you gonna get it?

Say you want to import the list of metropolitan areas that Last.fm produces charts for. (I’m actually doing this for a project.)

Fetching the data from the Last.fm API is easy enough. Just make a web request with cURL and save the response to a file.

curl "http://ws.audioscrobbler.com/2.0/?method=geo.getmetros&api_key=7570768adbf999b04fadb54aa0548a96&format=json" \
> lastfm_metros.json

What did you get?

cat lastfm_metros.json

Yuck! How can you even read this, never mind work with it in a spreadsheet!?


At this point you might be tempted to throw Perl or Python at the problem.

If you’re like me, you don’t like to write code when there’s already a tool out there that solves the problem.

Are there any tools out there that solve this problem? Yes!

The first is jq, a command-line JSON processor. It’s great for exploring and reshaping JSON.

If you use Ubuntu, you can install it as an apt package.

sudo apt-get install jq

The second tool is recs-tocsv, part of the RecordStream suite. It simply converts a certain shape of JSON to CSV.

RecordStream is written in Perl, so you can install it as a Perl module.

sudo cpanm -i App::RecordStream

With these tools you can solve the problem in three steps.

Step 1: Use jq to reformat the messy response so you can actually read it and figure out its structure.

Step 2: Use jq to transform the complicated response into a simpler intermediate form.

Step 3: Use recs-tocsv to turn the intermediate form into CSV, optionally specifying a column order.

Let’s get started with jq to complete step 1.

The simplest useful thing that jq does is pretty-printing. Just pipe some JSON into it and use the dot filter.

cat lastfm_metros.json |
jq .

The dot filter is jq’s identity transformation. Logically the output is identical to the input, but jq writes it out neatly with whitespace and colors so you can actually read it.

{
  "metros": {
    "metro": [
      {
        "country": "Australia",
        "name": "Melbourne"
      },
      {
        "country": "Australia",
        "name": "Adelaide"
      },
      {
        "country": "Australia",
        "name": "Sydney"
      },
      ...
    ]
  }
}

Is it clearer now? The response is basically a list of country-metro pairs, but each pair is wrapped in an object, in a list, in an object, in an object.

The final output will obviously be a CSV file with two columns.

Let’s help recs-tocsv to help us by creating the form that it works best with. This is step 2.

recs-tocsv prefers a stream of flat linear JSON objects as input.

Stream means the output should be one JSON object per record; not one big object. In this example, one metro is one record.

Flat means the keys of each object should refer only to scalar values; no lists or other objects. The metro objects are already flat.

Linear means each object should be output on one line of text; line breaks shall delimit records. The original response happens to be linear.

You can create a record stream with a sequence of jq filters. Just like how bash allows you to pipe cat into jq, so jq allows you to pipe one filter into the next.

cat lastfm_metros.json |
jq ".metros | .metro | .[]"

The filter .metros outputs the value of the metros key, which is another object. .metro outputs the value of the metro key, which is a list. .[] outputs all the list elements as separate values.
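As an aside, the same selection can be written as one path expression. A minimal sketch with a small inline sample instead of the full Last.fm response (assumes jq is installed):

```shell
# .metros.metro[] is equivalent to the chained filters .metros | .metro | .[]
echo '{"metros":{"metro":[{"country":"Australia","name":"Melbourne"}]}}' |
jq '.metros.metro[]'
# {
#   "country": "Australia",
#   "name": "Melbourne"
# }
```

Either spelling produces the same stream; the chained form just makes each step explicit.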

With this sequence of filters, jq produces a flat stream.

{
  "country": "Australia",
  "name": "Melbourne"
}
{
  "country": "Australia",
  "name": "Adelaide"
}
{
  "country": "Australia",
  "name": "Sydney"
}

To make the stream linear, switch on “compact” output.

cat lastfm_metros.json |
jq ".metros | .metro | .[]" --compact-output

Compact mode removes all the unnecessary whitespace, except for newlines at the end of records.
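To see the effect on a small inline sample (a sketch — assumes jq is installed; the real pipeline reads the saved response instead):

```shell
# Compact output: one record per line, no extra whitespace.
echo '{"metros":{"metro":[{"country":"Australia","name":"Melbourne"},{"country":"Australia","name":"Adelaide"}]}}' |
jq ".metros | .metro | .[]" --compact-output
# {"country":"Australia","name":"Melbourne"}
# {"country":"Australia","name":"Adelaide"}
```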


Now you have the correct intermediate form, step 3 is gonna be simple.

Step 3: pipe the output into recs-tocsv, and save to a file.

cat lastfm_metros.json |
jq ".metros | .metro | .[]" --compact-output |
recs-tocsv > lastfm_metros.csv
Ta-da! CSV at last.


If the order of columns doesn’t matter to you, then you’re done!

Bulk insert operations into a relational database often expect a certain column order in the flat file source.

If you need to swap the two columns around, use the key parameter.

cat lastfm_metros.json |
jq ".metros | .metro | .[]" --compact-output |
recs-tocsv --key name,country

The key parameter takes a comma-separated list of key names in the JSON source, and basically looks just like the header of the CSV file.
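Incidentally, if installing RecordStream isn’t an option, jq alone can emit the rows with its @csv formatter. This is a hypothetical alternative, not part of the recipe above, and unlike recs-tocsv it doesn’t emit a header row (assumes jq is installed):

```shell
# Build each CSV row by hand: pick the fields in order, then format with @csv.
echo '{"metros":{"metro":[{"country":"Australia","name":"Melbourne"}]}}' |
jq -r '.metros.metro[] | [.name, .country] | @csv'
# "Melbourne","Australia"
```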


All done!

Creating MSMQ queues with PowerShell

Today I was fixing a troubled installation of an inherited ETL framework.

Two required message queues were missing from the environment. In fact, MSMQ was not even installed on the server.

I installed MSMQ, but “Message Queuing” was still missing from the “Computer Management” interface. How was I gonna create the queues now?

You guessed it: PowerShell.

Add-Type -AssemblyName System.Messaging
$msmq = [System.Messaging.MessageQueue]
$msmq::Create('.\private$\etl_notifications', $true)
$msmq::Create('.\private$\etl_transform_tasks', $true)

Line 1 loads the System.Messaging assembly, the .NET interface to MSMQ.

Line 2 makes the next lines easier to read. The MessageQueue class has static methods to manage queue lifecycle. Now we can refer to it as just $msmq.

Lines 3 and 4 call MessageQueue.Create to do the actual work of creating the queues etl_notifications and etl_transform_tasks.

The framework uses private queues, so the queue names are prefixed with .\private$\. The $true means it’s a transactional queue.

The weird thing is that “Message Queuing” appeared in the “Computer Management” interface after I added the queues.


Had that happened in the first place, I would never have figured out the PowerShell way! So, thanks, buggy GUI, I guess.

Even more thanks to Jainath V R for his step-by-step PowerShell guide!

How do you search a different active directory domain?

Sometimes you’ll see a service account in SQL Server that you can’t easily find in Active Directory.

Say you want to find the service account for processing Adverts.

$ Get-ADUser -Filter "Name -like '*Advert*'" | Select Name

No results. Damn!

This was frustrating until someone reminded me that the account was probably outside the corp domain, which mostly holds human users like me. Your own domain is the default domain for the AD cmdlets.

So how do you search other domains?

Use Get-ADForest to list all the domains in your forest.

$ (Get-ADForest).Domains

Use the -Server parameter of Get-ADUser to override the default domain value. It’s oddly named, but it’s basically synonymous with Domain. (It actually refers to an instance of Active Directory Domain Services.)

If you want to search all the domains, just set up a pipeline.

Select UserPrincipalName at the very end to distinguish the different domains.

$ (Get-ADForest).Domains | % { Get-ADUser -Server $_ -Filter "Name -like '*advert*'" } | Select UserPrincipalName


Thanks to Steve Mahoney on the PowerShell.com forum for explaining this.

Windows 7 Desktop Bug Renames Every File

Sometimes when you create a folder on the desktop, Windows 7 does what you ask, but also bugs out, asks a stupid question, and then renames every file on your desktop.

stupid question

The bug is that the folder name gets applied to every item on the desktop. The silly part is that it freezes the desktop while it warns that you can’t rename the Recycle Bin.

The error text is:

An unexpected error is keeping you from renaming the folder. If you continue to receive this error, you can use the error code to search for help with this problem.

Error 0x80004001: Not implemented

everything got renamed

You can undo the bug at the expense of losing the name of the new folder, so it’s not disastrous. It’s just stupid.

undo rename

Filing this here because I don’t know where Microsoft accepts public bug reports for Windows 7.

When I did search for help on the error code, I found a Technet post describing the same issue. No reason or solution was proposed.

Xubuntu Remote Desktop

At Sand Port we made a media center out of my Xubuntu ThinkStation. Now we have an easy central place for listening to tunes and watching fireplaces.

We’re a pretty lazy bunch, and we’re often fiddling with laptops while something is on the TV. Wouldn’t it be great if we could control the media center without even lifting our hands from the keyboard?

I want to make it easy for others, so setting up an RDP server seemed like the best solution. Windows has a built-in RDP client so my flatmate wouldn’t have to install any software.

To make this work in Xubuntu I used xrdp and vino on the server, and on the testing client I used nmap, freerdp, and Remmina.

Mapping the network

The first step is to find the media center from my laptop.

Use nmap -sn (ping scan) to find hosts on the local network.

$ nmap -sn

Starting Nmap 6.40 ( http://nmap.org ) at 2014-04-01 21:41 BST
Nmap scan report for
Host is up (0.020s latency).
Nmap scan report for
Host is up (0.043s latency).
Nmap scan report for
Host is up (0.000067s latency).
Nmap done: 254 IP addresses (3 hosts up) scanned in 3.38 seconds


Three IPs: 1 is the router, and 6 and 10 are my media center and laptop. Which way round, though?

I ran ifconfig at the media center to find out its own IP address.

$ ifconfig
eth0      Link encap:Ethernet  HWaddr 00:21:86:fa:f0:45  
          inet addr:  Bcast:  Mask:

The output tells me I can use to refer to it on the local network.

Enable RDP on the server

Setting up the actual RDP server is as simple as installing a package.

sudo apt-get install xrdp

The default port for the RDP protocol is 3389.

Check just this port using nmap on the laptop.

$ nmap -p 3389

Starting Nmap 6.40 ( http://nmap.org ) at 2014-04-01 22:25 BST
Nmap scan report for
Host is up (0.0039s latency).
3389/tcp filtered ms-wbt-server

Nmap done: 1 IP address (1 host up) scanned in 0.49 seconds

Previously I locked down the media center ports using the gufw firewall. I made an exception for all incoming connections on port 3389.

gufw rule

Now the port is open.

$ nmap -p 3389

Starting Nmap 6.40 ( http://nmap.org ) at 2014-04-01 22:45 BST
Nmap scan report for
Host is up (0.0031s latency).
3389/tcp open  ms-wbt-server

Nmap done: 1 IP address (1 host up) scanned in 0.49 seconds

Start a new RDP session

Install freerdp on the laptop. It’s a command line RDP client.

sudo apt-get install freerdp-x11

Use freerdp to connect to the media center on the default port.


Got a log in screen. So far so good.

freerdp login

Log in as sandport.

Login appears to be successful, but all I see is a blank screen. Rubbish.

blank remote desktop

You have to put the name of the desktop manager in a file called .xsession in the sandport home directory.

echo "xfce4-session" > .xsession


Try again. Success!

remote desktop success

Some of the icons look wrong, but I can live with that.

The main issue is that this actually creates a new desktop session. What I really want to do is share control of the existing desktop so I can queue stuff up on Spotify.

Sharing the main desktop

Ubuntuwiki has a guide to desktop sharing with Xrdp that contains almost everything I needed.



Back to the server to install vino, a desktop sharing server for VNC. This works because xrdp actually uses VNC on the server and talks to clients using RDP.

sudo apt-get install vino

Unfortunately I saw this error because Vino doesn’t start automatically on XFCE.

“connecting to error – problem connecting”

To make it start in XFCE you have to add XFCE to the list of desktops in the autostart file.

The autostart file is here:


You have to change the line with OnlyShowIn to look like this:


To check that it worked, restart XFCE and inspect the output of netstat -antp for an instance of vino-server listening on port 5900.
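A concrete check along those lines — a sketch that assumes the net-tools netstat and that vino uses the default VNC port 5900:

```shell
# Report whether anything is listening on the default VNC port.
if netstat -ant 2>/dev/null | grep -q ':5900 .*LISTEN'; then
  echo "vino-server is listening on 5900"
else
  echo "nothing listening on 5900"
fi
```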

For convenience, rearrange the desktop options in /etc/xrdp/xrdp.ini so that the main desktop (console) is at the top. Make the username blank so that all you have to type is the password.


Use Remmina for everyday RDP use in Xubuntu. It’s like the best of the Windows built-in client and RdpMan. You can save connection settings and you get the floating menu when you are connected.

Remmina’s awesome feature is that it automatically scales the remote desktop to fit your screen. Useful if your main desktop is on a widescreen TV!

The Windows client actually supports this too, but it’s hidden. Right click on top-left icon and choose “smart sizing” to fit the large screen into the smaller one.


Getting remote desktops (not shared) was enough of an achievement, so I played about with those for a while.

Eventually you’ll get this message if you keep failing to log out properly.

xrdp_mm_process_login_response: login failed


I followed the advice of Linux Toolkits to delete old X sessions and restart xrdp.

Still no joy.

Looked at /var/log/xrdp-sesman.log and saw that it still thought it had run out of displays.

Linux Toolkits encountered this too.


Instead of just upping MaxSessions to 100, I reset the X11DisplayOffset counter to 1 and restarted the server.


At some point something messed with the ownership of an .Xauthority file.


I saw messages like this when I was trying to run “gksudo mousepad”:

“Failed to run /usr/sbin/synaptic as user root.
Unable to copy the user’s Xauthorization file.”


I thought it was just Xubuntu being weird about something.

But when I restarted the media center it prompted me for a password, even though I asked it not to.

And when I gave it the password it just asked me again, and again, and again.

CTRL + ALT + F2 got me to the emergency shell and I could log in there.

sudo chown sandport:sandport .Xauthority

So I learned a lot of useful stuff, and gained some appreciation for how well tested Windows is!