We are currently living in “Interesting Times”. While this may be good news for journalists and the 24×7 rolling news cycle, it’s not necessarily good news for those of us who use Azure or, indeed, do any kind of international trade.
You may be tracking your costs in Azure – you should be tracking your costs in Azure, and if you want, we [Coeo] can help with that, but that’s not the point of this article.
You may be trying to stabilise your costs and budget by committing to longer terms with Azure Reservations. This makes sense if you know you’ll need that resource for that time, and it can give a significant discount, but it’s not a guarantee of costs.
But We Have Reserved Instances
So you signed up for a three-year deal to keep costs down and more predictable. But the real world doesn’t work like that. You’ll be OK if you’re paying in USD, but those of us in the rest of the world – that’s about 95-96% of the planetary population – are subject to the vagaries of exchange rates when buying any Azure service.
A Blast from the Past
Cast your minds back to the tail end of 2022. We had a query come in from a client who wanted to know why their reserved instance pricing had gone up by 10% for October, and then gone back down again in November.
Microsoft Azure pricing is in USD. Looking at the dollar costs, we can see only the slight variations expected thanks to the hourly cost being multiplied by the different numbers of hours in a month:
When converted into local currency (GBP) for billing, however, we see a different story:
Most months, the price seems broadly similar. In October, however, there was a spike, before a return to expected levels in November.
Lettuce Learn From This
What happened? Why did the price suddenly go up and then drop back down?
This is a combination of how MS’s pricing is calculated, and the impact of politics on the mundane.
Microsoft’s pricing for Azure services is in US Dollars. This is then changed monthly to a local currency for billing. From the Azure pricing FAQ:
What exchange rate will be used and how does it work with my bill?
We use London closing spot rates that are captured in the two business days prior to the last business day of the previous month. If the two business days prior to the end of the month fall on a bank holiday in major markets, the rate-setting day is generally the day immediately preceding the two business days. This rate applies to all transactions during the upcoming month and will be applied to all Azure purchases and all Azure consumption in that month.
(In 2022, they used Thomson Reuters benchmark pricing, but the effect was the same.)
Those who raised an eyebrow at the heading of this section may have already figured out why this particular price hike happened. In September 2022, there was a mini-budget in the UK that was, shall we say, less than favourably received by the international markets. This caused the UK pound to fall at just the wrong time of the month for those of us with Azure bills to pay.
That peak rate lasted only a few days, but it landed at exactly the wrong time of the month, and so Azure prices went up.
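To make the mechanics concrete, here’s a quick sketch of the sums. The hourly rate and exchange rates below are made-up illustrative numbers, not real Azure prices – the point is that the same USD cost, multiplied by a rate captured during a currency slump, produces a noticeably bigger local-currency bill:

```sql
-- Illustrative only: hypothetical hourly cost and exchange rates
DECLARE @HourlyUSD  DECIMAL(10, 4) = 0.50;   -- made-up reserved instance hourly cost
DECLARE @HoursInOct INT            = 744;    -- 31 days * 24 hours
DECLARE @UsualRate  DECIMAL(10, 4) = 0.87;   -- GBP per USD, a typical month
DECLARE @SpikeRate  DECIMAL(10, 4) = 0.96;   -- GBP per USD, captured during the slump

SELECT @HourlyUSD * @HoursInOct * @UsualRate AS UsualMonthGBP,  -- ~323.64
       @HourlyUSD * @HoursInOct * @SpikeRate AS SpikeMonthGBP;  -- ~357.12, roughly 10% more
```

Same dollar cost, same number of hours – the only thing that moved was the rate captured on the wrong day.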
Interesting Times
Although “May you live in interesting times” may not actually be an ancient Chinese curse, the sentiment is there. And I, for one, would rather not live in interesting times, if only to avoid having to have this sort of conversation with clients…
PS
This blog is the original version of the post I submitted to the Coeo corporate blog. They made a few minor edits – can you see the differences?
Sorry for being a bit quiet of late – almost all of my work-related blogging energy has been directed at both the official Coeo blog, and our internal wiki.
Some of the external Coeo blog posts I’ve written:
What’s life like working at Coeo? This was my first official blog post at Coeo, and was started a few months after I joined. I ran the first draft past a now-colleague before he joined, and he found it useful, and then didn’t get round to actually pushing it to the fore for another few months. The colleague? He’s still with us.
Is it worth tuning that query? This was a big win from a tiny query – tuning a 2 millisecond query can be worth the time. Betteridge’s law of headlines does not apply.
VMWare and SQL Server – The Basics. This one caught the attention of some high-up at VMWare. Oops. Mind you, nothing I wrote here is inaccurate, and the number of places that still haven’t implemented these basic recommendations is too high.
The reason I write this now? Find out next week – the answer will shock you! No, shouldn’t put the clickbaity stuff at the end…
…and I really ought to tidy up the sidebars. Another time, another time…
Yeah, it’s a bit quiet here. That’s because I’ve been writing at work, and one of the things I’ve written has been published here at the Coeo official blog:
20-mumble years ago, I was working on a multi-year multi-million dollar multi-national project. OK, I wasn’t there as a DBA, but SQL Server was involved, so I’m claiming it is suitable for this.
The project was initiated in the usual way. A $2M budget was set. A team was assembled within the company, and some consultants were brought in from outside to do some of the particularly niche techie bits. (I was one of those external bods.)
I arrived to set up that particularly techy back-end bit, and to write the code around it. Which I did – in a way that was easy to document and easy to extend. After a few months, I had done what I needed to do, and handed over my bit to a new guy at the client’s firm, and went on my way to other projects. The rest of the project was for Other People to deal with, and the aim was that it would be done in six months or so. Yeah, right. You can see where this is going.
Time passes…
Two years later, I was back on that client site. The guy I had handed over to had changed virtually nothing – “about six lines of code” – because he hadn’t needed to.
The project still hadn’t been delivered.
Thorin sits down and starts singing about gold…
There had been a bit of a hoo-hah, apparently, and the sort of project meeting that may have involved having to mop blood off the floor – I couldn’t comment, as I wasn’t there. However, during those times, the project had gone through $7.5M of its original $2M budget. And a gap analysis on the project had identified a need for a further $3.5M to get the project over the line.
The project was cancelled. I was back on site to wrap up and document everything that we had done.
I assume some heads rolled. I would be surprised if they didn’t.
Apologies
My apologies for any lack of coherence – I’ve just spent the last five hours on the road, after a full day of work, and only saw the invitation quite late in the evening… excuses, excuses…
OK, so I’m jumping on the Steve Jones / Brent Ozar bandwagon of explaining why I’m not attending SQLBits this year…
Why am I not going? Covid.
No, not because I’ve got Covid, or I’m going to have Covid, or am suffering from Long Covid, or caught Covid at the last SQLBits I attended, or am afraid of catching Covid. And yet Covid is still the reason.
Me, 2020, Music, and I
As some of you may know, I’m also a musician of sorts. I play double bass well enough that local orchestras ask me to dep on short notice, and I’ve played in quite a few operas over the years. I even blog about it sometimes as “TheLoneDoubleBass”. I enjoy doing this, I regularly get booked up a year or more in advance for some of these things, and this was the case back in 2020.
The West Yorkshire Symphony Orchestra had, in the years running up to Covid, been putting on occasional semi-staged operas. These have included Traviata, Cavalleria Rusticana, and Cosi Fan Tutte. In 2020, on the last weekend in March (or was it the one before?) we were due to perform Puccini’s “Le Villi”, and the usual final few weeks’ preparation work were well under way.
And then Covid lockdown happened – it kicked in a few days before our performances were scheduled.
Opera’s not a trivial thing to put on – musicians, singers (I love that distinction), venues, everything – it can be complicated to rearrange. People move on – venues move on – and getting things back together takes time.
That time has now passed, and WYSO is putting on another opera that weekend. It was going to be Le Villi, because why not reuse the effort that’s already been put into rehearsing it, but there are issues that make that currently not a viable option, so we’re teaming up with Northern Opera Group to do two performances of Parry’s short opera “Pied Piper of Hamelin”.
So I’m doing that instead.
It’s not the first time I’ve done an opera rather than attending SQLBits, but there’s only so many weeks in the year, and it’s quite often a case of who books me first. And, as I said, I can get booked up for gigs waaaay ahead of time.
So, I hope you all going to SQLBits have a lovely time, and I hope that I can make it next year. Or maybe I’ll be doing a different gig…
When I started out as a full-time DBA back in 2010, Ask.SQLServerCentral (AskSSC) was the first Q&A site I found that was active, friendly, and easy to use.
It differs from the main SQLServerCentral Q&A boards by not being broken down into questions relating to a particular version of SQL Server, and a particular aspect of that version. This makes it easier for someone to just ask a question without worrying about whether their question is going to the right place. A great benefit to those new to the scene.
It’s been through various hosting platforms – I dimly recall that it was hosted for a while on one of the Stack environments, but, again, for Reasons, it was moved elsewhere.
I met some great people through AskSSC. The first time I went to SQLBits, for example, the first two people I met there were two of the stalwarts – Jonathan Allen, and Kev Riley. I later worked with Jonathan on the committee for SQL Southwest’s SQLSaturday events.
Somehow I found myself on the moderatorial team of AskSSC, and slowly crawled my way up the charts of users there – but still only in 4th place behind Jonathan, Kev, and, erm, Grant Fritchey. Hail the mighty ScaryDBA. (He’s not been on the site for over 4 years, tsk.)
I stepped away from answering questions there a couple of years ago, mainly because that’s now part of the day job – working at Coeo as a “database platform engineer” in the managed services side, supporting 60ish clients with some really tricky stuff. I’ve just not had the mental capacity to stick around on AskSSC much other than to occasionally sweep away some of the mountains of spam that accumulate.
The last few years have not been kind to the quieter Q&A boards, though – mountains of spam, the rampant growth of the StackExchanges of this world, and the improvement in accessibility to the suppliers themselves via their support forums have all contributed. And then there’s the suppliers of the Q&A board platforms themselves…
It’s a shame it’s going – it really was a friendly place that helped build the wider SQL community. But the world moves on. I’ll miss it – mainly for the people I met along the way.
The above ramble was brought to you by a lack of sleep and a lack of caffeine.
There I was, coming to the end of my contract with the data science team at NHS Digital, working on shutting down their mahoosive SQL Server, and beginning to think about looking for the next contract. So I was trawling the job sites, as usual, and keeping half an eye on LinkedIn, and I got a message from James Boother, who I’ve known for years thanks to various SQLBits events and SQL Saturday Exeter (I remember what you did, James…) He suggested a chat, which kinda escalated into a couple more formal chats, and suddenly – as in less than one office hour after the fourth such chat – I had an email with an offer and a ridiculously close start date.
August 2021
After the fastest recruitment process I’ve ever seen (once it got going), I have abandoned the world of contracting and gone back to the Permie Life, for a little company called Coeo, as a “Data Platform Principal Engineer”, working in the team looking after their support clients. Two(*) Three months later, my head is still spinning – but that might be less to do with the pace of the work and life generally over the last few weeks. But that’s another story(+).
(*) Yeah, I’ve been taking my time over this…
(+) Don’t ask, unless you’re bringing me beer. Or gin. Gin’s good. Anyway, given the time lapse between starting this post and publishing it, things have calmed down on the life front…
Why Go Permie?
Well… They made me an offer I couldn’t refuse, because it was the offer I requested, and it would have been a bit churlish to turn it down. It’s financial stability for me, which, given the current state of the UK, and the offspring reaching A Levels / University ages mightn’t be a bad thing.
Once upon a time, when I was new to the whole contracting thing, SQL Bits put on a conference at York University. I had nothing else to do that Saturday, it was free, it was only an hour’s drive from home, and there was the chance of meeting a couple of guys I’d been chatting with over on Ask.SQLServerCentral. So I went. The first person I met was one of the guys I’d been hoping to meet – Jonathan “fatherjack” Allen, and the next person was Kev Riley – both of whom were AskSSC regulars. And I also met a few other guys, including the one and only Buck Woody, and topped him up with coffee, but that’s another story, and Kevin Kline, and Rob Farley who greeted me with “Oh, *you’re* the twitter guy”. I guess I had a reputation already. Oh dear. I mean, I’d only been a full-time DBA for a few months.
Aaanyway, where was I? Oh yes, pondering the future while meeting people.
During that day, I think the idea either came to me, or coalesced, that maybe after two or three years of doing short term (three to six month) contracts, I would start hunting around for a permanent role again (at this point, I really had only been a contractor for a few months, and was still in the “permie” mindset). I had enjoyed my time as a consultant many years ago – it had that mix of talking with people across and through an organisation rather than just a small team, a wide ranging role, lots of new challenges – much like contracting, but with a regular guaranteed pay packet. And the short term contracts over that time frame would be, what, 8-10 different environments, different sets of problems, different ways of thinking – so a reasonable exposure to the product set and the way people used it. (I should point out that, at that point, I already had 15 years of experience of SQL Server, having been working with it since 1995, but it had only ever been part of my job – the rest of it being, well, the whole IT career thing – programmer, consultant, architect, manager, general factotum.)
This is rambling a bit – let’s move it along.
So Who To Work For?
What to look for? Over the next few years, my wishlist of things started coming together. A mix of work types, a solid team, a friendly bunch of people, a challenge, no, regular challenges. Oh, and some training would be nice! I mean, I probably wasn’t going to get all that at a large organisation, and my experiences of being contracted through a certain three-letter organisation were definitely sub-optimal, so I was thinking about smaller (but not too small) firms.
But what sort of firm? Straight consultancy? or a pre-sales technical role? Over the years, I got down to a small list of companies, most of whom will be familiar to those who attend these conferences – Microsoft (obviously not meeting all the criteria, but on the list by dint of being Microsoft), a couple of suppliers (Redgate, obviously, and maybe SolarWinds and SQLSentry/SentryOne), and Coeo.
Coeo logo
Why Coeo?
Well, they offered…
…Yeah, there’s more to it than that.
When I started out in this SQL Server contracting line, they were a small(ish) firm with a reputation for having some of the most qualified SQL Server guys in the country, if not the world. Microsoft used to have a MCM (Microsoft Certified Master) programme, very limited, very expensive, very hard to get. And then they went one better with the MCA – Microsoft Certified Architect. And, in the UK, the company that had the most of these MCMs/MCAs outside of Microsoft was Coeo. And this was not a large company. So I knew they were serious about training and certification.
I definitely remember having a conversation with a couple of their guys at SQLBits in Brighton in 2011. One of those guys was Gavin Payne, who set up the system that I was looking after at that time at MessageLabs. And, given the age of bits of it, he was surprised that some of the servers were the same as the ones he had set up… It turns out that ML/SYMC wasn’t the only company Coeo & I shared as previous clients – yes, even though I’d never worked for them, I was still recommending them, based on their reputation.
I’d kept in touch with them at Bits, and other events, where they were regular sponsors, and the question of working for them did crop up from time to time, but they weren’t looking for remote-based staff, and I would have to be in the office most of the time, and that wasn’t an option – not as a permie, anyway.
And then came Covid, and lockdowns, and home-based working, and they found that they could cope with people being remote, and now they have staff up and down the land – which is one of the reasons that James started trying to get in touch.
Various other #sqlfamily friends have been through Coeo’s ranks, which I’m not quite sure what to think about – but none of them have left really recently, and some of them were there for quite a long time, so maybe it’s all OK really. I’ve not gone asking, because that always seems a bit odd to me. Certainly it’s more a question I would discuss over a beer than over email, and these things are quite often personal.
Anyway. I’ve been there for a few months now, and I finish work most days with my brain fried – I’m not used to having to think this much! I think after another few months of this I’ll have hit pretty much every weird and wonderful feature of SQL Server – and every version from 2005 onwards – and I’m keen to keep going.
And, who knows, maybe they’ll even let me blog on their site as well as my own! Although, given the rambling above, I suspect I’ll be edited. Heavily.
Custom extended properties can be a useful tool when creating a self-documenting database. However, in an environment where tables are routinely dropped and recreated by software/systems not within your control, it may be desirable to have these stored so they can be repopulated when the table is recreated.
The obvious way to capture this data for this purpose would be to run a query in a DROP_TABLE trigger and store the output in a separate table for (re-)use later (eg on a create table trigger to repopulate the values as they were when the table is recreated).
USE Testing;
DROP TRIGGER IF EXISTS DropTableGetVars ON DATABASE;
IF OBJECT_ID ('TestDrop', 'U') IS NOT NULL EXEC ('DROP TABLE TestDrop');
GO
CREATE TABLE TestDrop (i INT);
GO
EXEC sp_addextendedproperty
@level0type = 'SCHEMA',
@level0name = 'dbo',
@level1type = 'TABLE',
@level1name = 'TestDrop',
@name = 'Purpose',
@value = 'Table for testing drop functionality';
GO
These first few lines drop anything we might be about to use, then create a test table and add a custom extended property.
Now, a query to check the extended property is there by querying the sys.extended_properties view:
SELECT s.name, t.name, ep.name, ep.value
FROM
sys.extended_properties AS ep
INNER JOIN sys.tables AS t
ON ep.major_id = t.object_id
INNER JOIN sys.schemas AS s
ON s.schema_id = t.schema_id;
And the results:
Drop Table Trigger
Now, a quick trigger to capture (well, show, in this case) the EPs when dropping the table:
CREATE TRIGGER DropTableGetVars
ON DATABASE
FOR DROP_TABLE
AS
BEGIN
PRINT 'Drop table trigger';
SELECT s.name, t.name, ep.name, ep.value
FROM
sys.extended_properties AS ep
INNER JOIN sys.tables AS t
ON ep.major_id = t.object_id
INNER JOIN sys.schemas AS s
ON s.schema_id = t.schema_id;
END;
GO
A very simple trigger – print a message, and run that select statement. The idea being (later) to enhance that functionality by inserting the data into another table.
Anyway, let’s test it:
DROP TABLE TestDrop;
GO
The output:
Drop table trigger
(0 rows affected)
Completion time: 2021-03-06T12:18:39.2286808+00:00
And the results of the SELECT:
Nothing there. Bother. It’s already gone before we get here.
Conclusion
This is the wrong approach for what I need to do – the extended properties are apparently deleted before the drop table trigger is fired. Which is a bore, and means I have to do one (or more) of:
write triggers against the extended properties object itself (no)
rewrite the extended properties procedures (no)
wrap the extended properties maintenance SPs (depends on people using your maintenance SPs rather than the default / standard ones)
write a routine to periodically poll sys.extended_properties and store the results elsewhere (probably the most reliable, assuming these don’t change value regularly, but a faff.)
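As a sketch of that last option (the table and column names here are mine, and this is only an outline, not a finished solution), a snapshot table plus an insert run on a schedule could look something like this:

```sql
-- Hypothetical snapshot table; run the INSERT from an Agent job on a schedule
CREATE TABLE dbo.ExtendedPropertySnapshot
(
    CapturedAt    DATETIME2   NOT NULL DEFAULT SYSDATETIME(),
    SchemaName    SYSNAME     NOT NULL,
    TableName     SYSNAME     NOT NULL,
    PropertyName  SYSNAME     NOT NULL,
    PropertyValue SQL_VARIANT NULL
);
GO
-- Snapshot the current table-level extended properties
INSERT INTO dbo.ExtendedPropertySnapshot (SchemaName, TableName, PropertyName, PropertyValue)
SELECT s.name, t.name, ep.name, ep.value
FROM sys.extended_properties AS ep
INNER JOIN sys.tables AS t
    ON ep.major_id = t.object_id
INNER JOIN sys.schemas AS s
    ON s.schema_id = t.schema_id
WHERE ep.minor_id = 0;  -- table-level properties only, not columns/indexes
```

Repopulating a recreated table is then a matter of reading the latest snapshot for that table and replaying sp_addextendedproperty – with all the usual caveats about stale snapshots if the properties change between polls.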
I was working on some functionality for a create table trigger. I had the whole creation / testing thing wrapped up in a single script for easy testing and possible deployment across multiple databases:
USE Testing
IF NOT EXISTS (SELECT * FROM sys.triggers WHERE name = 'NewTableTrigger')
EXEC ('CREATE TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE AS BEGIN SELECT 1 END')
GO
ALTER TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE
AS
BEGIN
PRINT 'Create Table Trigger Fired'
END
IF EXISTS (SELECT * FROM sys.tables WHERE name = 'TestTable')
EXEC ('DROP TABLE TestTable');
GO
CREATE TABLE TestTable (i INT);
All well & good. (Obviously, the functionality in the trigger was a little more complicated than a print statement…)
Great. Refresh the tables list in the object explorer to go poking around, and:
Hmm. We have a problem.
Add a line to the script to see if anything’s going on there:
USE Testing
IF NOT EXISTS (SELECT * FROM sys.triggers WHERE name = 'NewTableTrigger')
EXEC ('CREATE TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE AS BEGIN SELECT 1 END')
GO
ALTER TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE
AS
BEGIN
PRINT 'Create Table Trigger Fired'
END
IF EXISTS (SELECT * FROM sys.tables WHERE name = 'TestTable')
EXEC ('DROP TABLE TestTable');
GO
CREATE TABLE TestTable (i INT);
SELECT * FROM dbo.TestTable
And run…:
Create Table Trigger Fired
Msg 208, Level 16, State 1, Line 21
Invalid object name 'dbo.TestTable'.
Completion time: 2021-02-27T14:24:44.0301750+00:00
Not just me failing to see it, then.
After some step-by-step testing, I finally found the problem:
USE Testing
IF NOT EXISTS (SELECT * FROM sys.triggers WHERE name = 'NewTableTrigger')
EXEC ('CREATE TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE AS BEGIN SELECT 1 END')
GO
ALTER TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE
AS
BEGIN
PRINT 'Create Table Trigger Fired'
END
GO
IF EXISTS (SELECT * FROM sys.tables WHERE name = 'TestTable')
EXEC ('DROP TABLE TestTable');
GO
CREATE TABLE TestTable (i INT);
SELECT * FROM dbo.TestTable
Yup. That one “GO” missing at the end of the ALTER TRIGGER statement was causing the next part of the script to be executed as part of the same batch, meaning it was part of the trigger code, and the trigger was deleting the table that I was attempting to create.
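One way to spot this sort of batch-separator accident after the fact (a diagnostic sketch I’ve added here, not something from the original debugging session) is to read the trigger’s stored definition back and see what actually got compiled into it:

```sql
-- If the DROP/CREATE TABLE ended up inside the trigger, it will show in the definition
SELECT tr.name, m.definition
FROM sys.triggers AS tr
INNER JOIN sys.sql_modules AS m
    ON m.object_id = tr.object_id
WHERE tr.parent_class_desc = 'DATABASE'
  AND tr.name = 'NewTableTrigger';
```

With the GO missing, the definition would include the IF EXISTS / DROP TABLE lines after the END – a dead giveaway that they were swallowed into the trigger body.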
Have I said how much I hate the new WordPress “add post” page? No? Well, I do. Particularly when editing an earlier post…
Anyway. Where were we? Oh yes. Advent Of Code. For those of you who don’t know, it’s a problem-setting advent calendar where you have to solve the problems in order to progress. One problem with two parts per day. And the problem is usually accompanied by several hundred lines of input data, so you can’t do the problems by hand unless you’ve got way too much spare time.
I’ve tried this a few times over the years, and never managed to stick at it for very long, probably because, as a musician, this tends to be a busy time of year. 2020, though, is a bit different. No, a *lot* different. So I have more time to think about these things.
Loading data with BULK INSERT
Step one in solving these problems in SQL Server is loading the data into a database so you can do something with it.
I chose to use BULK INSERT, because, according to the documentation, it:
Imports a data file into a database table or view in a user-specified format in SQL Server
Ideal, right?
Day 1’s input is a list of numbers. So, in theory…:
IF OBJECT_ID('Day01Input', 'U') IS NOT NULL
BEGIN
EXEC ('drop table Day01Input');
END;
CREATE TABLE Day01Input
(
LineItemInput BIGINT
);
BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt';
GO
Right?
Msg 4864, Level 16, State 1, Line 13
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (LineItemInput).
Wrong.
OK, so let’s try with a varchar(max), in case there’s something weird going on in the file.
DROP TABLE dbo.Day01Input;
GO
CREATE TABLE Day01Input
(
LineItemInput VARCHAR(MAX)
);
BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt';
GO
Right?
(1 row affected)
Wrong. There are 200 lines in this input. So what’s going on?
mm’kay – what seems to have happened, then, is that the whole text file has been read into a single varchar(max) field. Which is fine, but not quite what I was hoping for. Fortunately, there are a few options you can apply with the BULK INSERT command, so this would seem to be an appropriate thing to try:
DROP TABLE dbo.Day01Input;
GO
CREATE TABLE Day01Input
(
LineItemInput BIGINT
);
BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
FORMAT = 'CSV',
DATAFILETYPE = 'char'
);
And…
Msg 4864, Level 16, State 1, Line 34
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (LineItemInput).
Oh.
So how about that ROWTERMINATOR parameter?
BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
FORMAT = 'CSV',
DATAFILETYPE = 'char',
ROWTERMINATOR = '
'
);
And…
Msg 4864, Level 16, State 1, Line 51
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (LineItemInput).
(ignore the line numbers – that’s my script of error generators…)
OK, so it doesn’t like the carriage return / linefeed combo. How’s about using CHAR(10)?
BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
FORMAT = 'CSV',
DATAFILETYPE = 'char',
ROWTERMINATOR = CHAR(10)
);
And…
Msg 102, Level 15, State 1, Line 48
Incorrect syntax near 'CHAR'.
Oh look. A different error. Still, I suppose this is progress of a sort…
How about we wrap that up in a variable, and pass that in?
DECLARE @RowTerm VARCHAR(1) = CHAR(10);
BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
FORMAT = 'CSV',
DATAFILETYPE = 'char',
ROWTERMINATOR = @RowTerm
);
And…
Msg 102, Level 15, State 1, Line 70
Incorrect syntax near '@RowTerm'.
What?
It turns out that every parameter in a BULK INSERT statement has to be a literal, not a variable, so the way to do this is way more complicated than first thought. The good news, though, is that we can ditch the rest of the WITH stuff, and just pass in the ROWTERMINATOR.
DECLARE @Bulk_cmd VARCHAR(MAX);
SET @Bulk_cmd
= 'BULK INSERT AdventOfCode2020.dbo.Day01Input
FROM ''C:\.....\AdventOfCode2020\Dec01-1 - input.txt''
WITH (ROWTERMINATOR=''' + CHAR(10) + ''')';
EXEC (@Bulk_cmd);
And…
(200 rows affected)
Hurrah!
Maintaining Data Order
For some of the AOC challenges, though, maintaining file / data order is important – so we’ll need to force that somehow. Obviously, SQL Server doesn’t do that in an easy way; if you put the data into a staging table like this one, and then insert it into a table with an IDENTITY (or SEQUENCE) field, you can’t necessarily guarantee that the data will make it in the right order.
The only way to guarantee order is to apply it as the data is loaded. I came across a tip somewhere to create a table with an identity field and a data field, and then a view on that table that just selects the data field, and then BULK INSERT into the view… So, for day three of AOC, I ended up with this:
IF OBJECT_ID('TobogganTreesInput', 'V') IS NOT NULL
BEGIN
EXEC ('DROP VIEW TobogganTreesInput');
END;
IF OBJECT_ID('TobogganTrees', 'U') IS NOT NULL
BEGIN
EXEC ('DROP TABLE TobogganTrees');
END;
CREATE TABLE TobogganTrees (RowID INT IDENTITY(1, 1), InputString VARCHAR(MAX));
GO
CREATE VIEW TobogganTreesInput
AS
SELECT InputString FROM TobogganTrees;
GO
DECLARE @Bulk_cmd VARCHAR(MAX);
SET @Bulk_cmd
= 'BULK INSERT AdventOfCode2020.dbo.TobogganTreesInput
FROM ''C:\.....\Dec03-1 - input.txt''
WITH (ROWTERMINATOR=''' + CHAR(10) + ''')';
EXEC (@Bulk_cmd);
GO
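Reading the data back in file order is then just a matter of ordering by the identity column that the view hid from the load:

```sql
-- RowID reflects load order, thanks to the view trick above
SELECT RowID, InputString
FROM dbo.TobogganTrees
ORDER BY RowID;
```

Without that explicit ORDER BY, of course, all the usual rules apply – SQL Server makes no promises about the order rows come back in.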