<h1><a href="https://www.rarelyprolific.co.uk/">Rarely Prolific</a></h1>
<p><em>Adventures in software development with countless distractions. Generated by Jekyll; feed last updated 2018-05-02.</em></p>
<h2><a href="https://www.rarelyprolific.co.uk/2017/06/04/introducing-two-way-communications">Introducing two-way communications</a></h2>
<p><em>2017-06-04</em></p>
<p>Up until today this blog was missing an important feature.</p>
<p>Here I am putting the world to rights and no one can scream back at me that I’m talking complete rubbish. Now they can because I’ve integrated comments on my blog
posts using Disqus!</p>
<p><em>Well.. I’m hoping I don’t get shouted at.</em></p>
<p>But it would be nice to know if anyone out there is reading this blog at the moment. I have not made any attempt to publicise this site in any way so far because
I really wanted to get some content on here first and, if I’m honest, prove to myself I would stick with it.</p>
<p>So I would really appreciate it if anyone leaves any comments or feedback. If there are at least a couple of people somewhere in the world who have enjoyed reading
something I have written then it makes the effort more than worthwhile.</p>
<p>Also, I don’t expect people to agree with everything I write. I’d like to think I have some good ideas now and again but I’m old enough to know I’m probably being
naive and ignorant when I comment about subjects I don’t have much experience with. I do try to keep an open mind though and welcome any alternative views and
constructive criticism.</p>
<h2><a href="https://www.rarelyprolific.co.uk/2017/05/31/they-dont-make-config-files-like-they-used-to">They don’t make config files like they used to</a></h2>
<p><em>2017-05-31</em></p>
<p>I was having a think about configuration file formats the other day and it occurred to me that so-called legacy configuration file formats could be better than
the formats we currently use.</p>
<p>Configuration files are read by applications but maintained by human beings. (I’m not including configurations which are maintained by a GUI administration
application. In those circumstances you’ll never open the file directly as you’ll always use the tool.) Therefore, in my opinion, the file format should be
optimised for people: It should be easy for them to understand and to update manually. All a computer needs to do is be able to read in and parse the file.</p>
<p>Most configuration files just contain a list of key/value pairs. <strong>So.. Simple huh?</strong></p>
<p>Back in MS-DOS and early versions of Windows, <a href="https://en.wikipedia.org/wiki/INI_file">INI files</a> were a very common standard. They generally just held one
setting per line such as: <strong>SOURCEDIR=C:\Source</strong></p>
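<p>INI files could also hold more than flat key/value pairs. Here is a slightly larger sketch (the names are invented for illustration) using the bracketed section headers many INI dialects support:</p>

```ini
; Comments in INI files usually start with a semicolon
[paths]
SOURCEDIR=C:\Source
OUTPUTDIR=C:\Build

[options]
VERBOSE=1
```

<p>Minimal noise: one setting per line, and nothing that isn’t either your data or a comment.</p>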
<p>I’m not sure why INI files became less popular. Maybe it was because we started loading settings into the <a href="https://en.wikipedia.org/wiki/Windows_Registry">Windows Registry</a>.
I think with hindsight most of us probably agree that the Registry is not a great place to store general application settings outside of low-level operating system
options. It became a bucket for everything and recklessly editing the Registry (and getting it wrong) could have some fairly harsh system stability consequences.</p>
<p>After that, I think <a href="https://en.wikipedia.org/wiki/XML">XML</a> became popular. You could store anything you liked in an XML file.. So we stored <strong>EVERYTHING</strong> in XML!
Especially configuration files. The problem with XML is that it contains a lot of “noise”. Schema declarations, opening tags, crazy amounts of attributes, closing tags.
Generally stuff that is part of the file format but isn’t your configuration data. Additionally, if your actual configuration values contained angle brackets or ampersands
you would need to escape them. Nasty!</p>
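<p>To see the noise in practice, here is roughly what that single <strong>SOURCEDIR=C:\Source</strong> line tends to become in XML. This sketch follows the .NET <em>appSettings</em> convention; the surrounding element names are boilerplate, not your data:</p>

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <!-- One actual setting, wrapped in five lines of format scaffolding -->
    <add key="SOURCEDIR" value="C:\Source" />
  </appSettings>
</configuration>
```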
<p>Currently we seem to be in love with <a href="https://en.wikipedia.org/wiki/JSON">JSON</a>. It’s probably an improvement over XML but is it <em>really</em> a good format for
human-readable data? Nevertheless, it has widespread adoption as a configuration file format. Just be careful with your commas (JSON doesn’t allow a trailing one) and put curly and square
braces in all the right places. Not to mention ensuring you keep the indentation neat to retain some level of readability.</p>
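<p>For comparison, here is that earlier <strong>SOURCEDIR</strong> setting (plus an invented flag) as a minimal JSON config:</p>

```json
{
  "sourceDir": "C:\\Source",
  "verbose": true
}
```

<p>Note the papercuts for a hand-edited file: backslashes in the Windows path must be escaped, there is no comment syntax at all, and a stray comma after the last entry makes the whole file invalid.</p>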
<p><strong>So where do we go from here?</strong></p>
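<p>As a preview of my answer, here are the same invented settings in TOML, the format discussed next:</p>

```toml
# TOML allows comments, unlike JSON
source_dir = 'C:\Source'  # literal string: no backslash escaping needed
verbose = true
```

<p>Close to the INI spirit, but with well-defined types and quoting rules.</p>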
<p>Personally, I’m becoming a fan of <a href="https://en.wikipedia.org/wiki/TOML">TOML</a> which stands for “Tom’s Obvious, Minimal Language”. It’s not currently a standard
and is still being defined. But it’s clean and very much like an evolved version of the INI format.</p>
<h2><a href="https://www.rarelyprolific.co.uk/2017/04/09/just-learn-anything">Just learn anything!</a></h2>
<p><em>2017-04-09</em></p>
<p>Trying to decide what to learn next is something I hear discussed a lot between developers. Sooner or later the conversation usually turns to whichever technologies are most useful to employers, based on whatever the latest job vacancies are listing as <strong>“essential skills”</strong>.</p>
<p>Whilst you should endeavour to keep your skills up to date, I prefer to just focus on learning for the sake of it as a regular activity. This is as opposed to being too concerned with needing to know a particular technology for whatever arbitrary reason.</p>
<p>I suppose I should probably justify that statement with some reasoning.</p>
<p>Firstly, I believe you learn a subject better when you have an interest in it. If it’s something you enjoy doing you’ll happily engross yourself in it to the point where you’ll lose track of time. If it is a chore to you, you’ll probably end up doing the minimum and not digging deeper into the subject in question.</p>
<p>I think developers can get hung up on languages and will often procrastinate on which language to learn over another. I’ve found that once you become proficient in a few languages you tend to easily be able to pick up and learn new languages on-demand as and when you need to.</p>
<p>As an example, I don’t know Java. But I do know other heavily object-orientated languages like C# and VB.NET which run on top of a managed runtime. So if I had to learn Java at short notice, I’m hoping I could give myself a crash-course and get the basics under my belt within a reasonable timeframe based on previous experience.</p>
<p>Languages are just syntax and, though this may be a bold statement, I think it is probably fairly easy to learn a new language if you already know a couple which are based around similar programming paradigms. Although, to be honest, if I see a new language my initial thoughts are: <em>What datatypes are available? How do I create and assign to variables? How do I write a basic if statement?</em> Then I learn the rest as I go.</p>
<p>Bouncing between languages can bring benefits. I remember coding a lot of .NET without having much need of or giving much thought to asynchronous programming. Following that, I worked more in JavaScript and found that to be effective I had to learn about and embrace thinking about code in asynchronous ways. The really cool part was when I hopped back over to .NET I understood <strong>Task&lt;&gt;</strong>, <strong>async</strong> and <strong>await</strong> as a bonus.</p>
<p>I’ve also flirted with C++ now and again, although I’ve never created anything meaningful beyond a basic Windows application with a Close button. My first experience was using <strong>cin</strong>, <strong>cout</strong> and raw pointers. A few years later I came back and modern C++ was a thing. Syntax looks friendlier (from my point of view) and smart pointers felt like something which should have been in there from the start.</p>
<p>Lately I’ve started playing with <a href="https://www.rust-lang.org">Rust</a>: A compiled language which promises the same raw performance as C++ but designed with memory and thread safety as a primary concern. It’s a new language which may or may not go mainstream but I feel that learning it teaches me more about programming in general. So I’m gaining valuable experience either way.</p>
<p>Here’s another: Which JavaScript framework should you learn? Again, I say any! If you learn a couple you find there are plenty of shared concepts between them. <em>One-way and two-way data binding between a JavaScript model and the DOM. Components and modularisation. Routing engines.</em> Once you know a couple, you could probably learn another framework fairly easily. Also your general level of JavaScript experience is going up with each line of code you write.</p>
<p>To sum up: <strong>I believe that, as a developer, doing ANY programming-related activity increases your knowledge. Swapping between languages and programming paradigms gets you thinking about problems and their solutions in different ways.</strong></p>
<p>I don’t think there is anything I’ve ever learnt about computers and technology that I would consider to be a waste of time.</p>
<p>My current obsession is the <a href="http://www.lexaloffle.com/pico-8.php">PICO-8</a> fantasy console. It’s an emulator for an 8-bit console that doesn’t exist! You code it in Lua and it contains a complete suite of embedded development tools. It’s a crazy amount of fun!</p>
<h2><a href="https://www.rarelyprolific.co.uk/2017/03/26/practising-and-prototyping">Practising and Prototyping</a></h2>
<p><em>2017-03-26</em></p>
<p>This is a continuation of my first blog post, <a href="/2017/01/25/innovation-and-learning.html">Innovation, learning and biting off more than you can chew</a>, which
was written as an example of what can happen when you take on too many unknowns in a production workflow.</p>
<p>At this moment in time I consider the act of programming to mainly be a creative medium, but one definitely on its journey towards being an engineering discipline. I’ll admit I’m making this up
as I go along but here’s my thinking:</p>
<h4 id="programming-as-an-engineering-discipline">Programming as an Engineering Discipline</h4>
<blockquote>
<p>I would describe an engineering discipline as a very mature method of construction. It could be making a vehicle, performing a controlled chemical reaction or building a skyscraper. Although
methods and techniques are always evolving, there are some baseline best practices which have been formed over a long period of time, whether that is a hundred years or a significant number of decades.</p>
</blockquote>
<blockquote>
<p>Programming doesn’t have the benefit of time. Although we talk of best practice in software development, if we’re being honest with ourselves, we mean we are trying to use the best method we
know of at this point in time. The act of coding is in a constant state of flux. Programming languages are changing and evolving. Even the hardware we are coding against is a moving target.</p>
</blockquote>
<blockquote>
<p>But, in everyday programming, there are some problems we consider to be already solved. We don’t usually concern ourselves with how graphics are rasterized to a display monitor. We’ll write
code at a higher abstraction level which is more concerned with what we want to show on screen but leave the low level specifics to drivers and APIs. So maybe plotting coloured pixels is an
example of software engineering. Although people are finding better ways of doing it, generally it’s something that all computers do and have long abstracted away into operating systems to take care of.</p>
</blockquote>
<h4 id="programming-as-a-creative-medium">Programming as a Creative Medium</h4>
<blockquote>
<p>Being creative I would describe as the use of original thoughts and concepts. This includes solving problems with lateral thinking. Coding is definitely and primarily a creative medium at this point
in time. If for no other reason than the fact that we don’t really know a foolproof, effective way of creating software yet. It’s common practice to release the initial version of a product or a
service as a beta. It isn’t because the creators have knowingly left in bugs <strong>(we hope!)</strong> but because we expect software to require a few iterations of development to become stable.</p>
</blockquote>
<blockquote>
<p>We’re just trying out new ways of developing software until we discover the optimal ways to do it. Until then, we’ll keep creating new languages, techniques, processes and methodologies which
attempt to solve the flaws of those which came before. <strong>I think it’s a very exciting time!</strong></p>
</blockquote>
<p>So, for argument’s sake, let’s call programming a creative practice. Practice I think being the apt term. If you look at other activities which are deemed to be creative, it is readily accepted
that you are required to practise them to become proficient.</p>
<p>An example: Let’s say you decide you want to learn how to play the guitar. You can learn music theory and you can even learn the specifics of how to use a guitar. But do you really expect to be able
to actually play a guitar without spending weeks, months or maybe years practising? I’d say not. So let’s assume you spend a couple of years becoming proficient on the guitar. If you swap over
to a keyboard or piano, you’ll probably need more months of practice to become good with another instrument despite the fact you know more about music at this stage.</p>
<p><em>(Slight disclaimer here: I don’t know much about making music or playing a musical instrument so I’m going along with my, possibly ignorant, beliefs here for the purposes of a reasonable analogy.
Please feel free to correct me (and I’ll correct this) if I’m completely barking up the wrong tree!)</em></p>
<p>So practice is important but we don’t seem to apply the same logic <strong>(pardon the pun!)</strong> to programming. We’ll go on a course, read a book, watch a video and then expect to be able to program
effectively with the knowledge we’ve just acquired. I’ve seen developers sent on courses to learn a technology come back afterwards and be expected to hit the ground running and use it
in a production system. Where is the time to practice?</p>
<p>How many times have you started working on a project and thrown the source code away completely, three or four times over, because you kept re-writing it? I’ve done this a few times. Some people consider this
to be a waste of time or a form of failure. I just think of this as my practice. I’m making mistakes, learning from them and figuring out from experience how to do something.</p>
<p>My personal belief is that you don’t need to be clever to be a reasonable programmer. You just need to be prepared to learn and put in a lot of practice.</p>
<p>If the need to practise programming is a real thing, how do you achieve this in a working environment? I do think if you want to be really good you have to be prepared to do a significant amount
of learning and coding in your spare time. But I believe practising programming also needs to be recognised in the office.</p>
<p>The best way I’ve discovered of doing this so far is by making <strong>prototypes</strong>.</p>
<p>You are trying to create something or solve a new problem at work using existing or new technology but you aren’t sure where to start.. Or if your idea will even work. Depending on the size of the problem
I normally give myself a specific timebox within which I would have expected to make a reasonable amount of progress towards a viable solution. Usually either a morning or an afternoon.</p>
<p>At the end of the timebox I will have “something” which can help me consider what the next stage of development will be and an idea of how long it’ll take. Alternatively, I’ll have nothing of use which
means I either need to look at the problem in a completely different way or it’s just a bad idea.</p>
<p>Within the timebox, I’ll hack on code with an aim to getting something working as soon as possible. I don’t care about TDD, unit tests, classes or even if I’m coding a massive method.
I just want something working. If I get something working early on in the timebox, I may take the time to tidy it up and consider how I would design the code for production.</p>
<p>The point is, I’m making a <strong>prototype</strong> and <strong>practising</strong> solving a problem or creating a solution. I’m expecting to throw this away and be able to accurately estimate how long it will take
to do the job properly. The time certainly hasn’t gone to waste. I’ve got the benefit of experience with a hint of hindsight before I get down to work for real.</p>
<h2><a href="https://www.rarelyprolific.co.uk/2017/02/12/extreme-programming-explained-book">Extreme Programming Explained - Book</a></h2>
<p><em>2017-02-12</em></p>
<p>Before I became a developer I thought that the most important skill for the role was hard technical knowledge. That is, being fluent in a few popular
programming languages and acquiring the knowledge and experience to produce a reasonably complex application.</p>
<p>Skip forward a few years and my thinking has changed. This is based on my own experience working as a developer creating software for businesses. Projects
I worked on were typically database-driven intranet web applications or back-end business data processing services.</p>
<p><strong>Now</strong> I would say that hard technical knowledge probably only accounts for fifty percent of the job. The skills which should make up the other
half are so-called “soft” skills: Speaking with the business to accurately determine requirements, working collectively as a team to deliver
a product and fitting development work into a pre-planned timeframe.</p>
<p>In fact I would even go as far as to say that in day-to-day situations in a development environment, soft skills will help you out much more
than in-depth technical knowledge. <strong>Sounds crazy I know! You are a developer.. You should be coding!</strong> The reality though is that programming is often
the part of the job you do at the end of the development process.</p>
<p>You start the work by figuring out what you need to develop. What should it do and what does it need to look like? The beginnings of answers to these
questions can often only be found by initiating conversations with the business, or non-technical people in the organisation who have first-hand
knowledge of the business requirement you are attempting to fulfil. But developers will often look to the code to try and find answers there instead
of prompting a dialogue: <em>We can poke around in the data until it makes sense? There must be an existing class that does the same kinda thing
we can use/modify?</em></p>
<p>The result is often that a development team will slave over a product for months while working off the bare bones of a specification. They’ll do their best to
implement features they think the business needs. Code will get over-engineered with additions that no one asked for. Features the business does
need will never be anticipated because they weren’t included in the specification and no one ever told the development team to add them.</p>
<p>This way of working can produce applications which don’t do what they need to: <strong>Fulfil the requirements of the business problem they are being
built to solve.</strong> The business guys are unhappy because it doesn’t help them. The developers are unhappy because they’ve worked really hard and,
in the worst scenarios, their application doesn’t even get released into production.</p>
<p>Part of the problem is developers spend a lot of their time studying technical skills. Sometimes from huge books which explain how to craft quality
code without compromise. They come out the other end raring to develop the best applications the world has ever seen. While this approach certainly
isn’t bad it doesn’t fit the real world most of the time. The average business environment is geared around balancing the amount of development effort
required to deliver a product, within a budget to enable the business to <em>(let’s be blunt here)</em> make money!</p>
<p>If you are a developer in a business environment who can relate to the above difficulties I would suggest reading a short book for a change:
<a href="https://www.google.co.uk/search?q=extreme+programming+explained"><strong>Extreme Programming Explained</strong> by <strong>Kent Beck</strong></a></p>
<p>It’s only about 160 pages long and doesn’t require a large investment of reading time. It’s not a technical book but it does give you suggestions
on how to implement technical solutions within the framework of a business environment.</p>
<p>Extreme Programming (or XP for short) is part of the family of development methodologies which is known as “being agile”. If you are used to working
in sprints and having scrums you’ll be aware of agile. If you are working in sprints but are struggling to figure out how to estimate or fit actually
writing the code into the user stories you are planning, XP and this book may help you.</p>
<p>I especially like the pragmatic attitude of this book. It readily accepts that all or some of the practices of XP won’t work in all situations, with
all teams or even with some types of people. But it does try, through realistic suggestions, to help you leverage some of the techniques to make your
life easier. Even if you don’t feel you can, or want to, adopt XP as a whole, I’m betting there will be at least a couple of tips in this book which
you will find useful.</p>
<h2><a href="https://www.rarelyprolific.co.uk/2017/02/11/the-problem-with-css">The problem with CSS</a></h2>
<p><em>2017-02-11</em></p>
<p>As a web developer <em>(as opposed to a web designer)</em> I have a love/hate relationship with CSS.</p>
<p>I’ll be the first to admit I’m no designer and I much prefer getting some code functional and operational with a view to making it look
nice later on. This means I absolutely love tools like Bootstrap because I can just spin up something which looks half-decent quickly
and put all my time and effort into the server-side logic.</p>
<p>So Bootstrap is wonderful but you do end up in the situation where your site looks like every other generic Bootstrap site. It’s kind of like
WinForms for the web. Sooner or later you will want to customise your styling and attempt to give it your own touch, however minor or major.</p>
<p>This is where you dive into the stylesheets and possibly end up getting frustrated!</p>
<p><strong>CSS is incredibly flexible.</strong> The cost of this level of power is that you need to take a little time to understand how it works. If your approach
to styling is googling how to affect an element, ending up on w3schools.com and pasting a style into your code, <strong>you may be doing it wrong!</strong></p>
<p><img src="/images/commitstrip-css-joke.jpg" alt="CSS Joke by CommitStrip" />
<em><a href="http://www.commitstrip.com/en/2014/09/26/the-worst-issues-are-not-always-where-you-would-expect-them/">CSS Comic</a> courtesy
of <a href="http://www.commitstrip.com">CommitStrip</a></em></p>
<p>The bottom line is that if you are a web developer who gets involved with front-end code you’ll more than likely have to deal with CSS.
This doesn’t have to be a harrowing experience. In fact it can actually be quite enjoyable <em>(it can also be a nightmare too but let’s stay
positive!)</em>.</p>
<p>I was lucky enough to be sent on a two-day course on CSS3 a few years ago and it definitely changed my thinking. Time is precious
but if you dedicate a few hours to learning some key concepts about CSS, it’ll probably make your life much easier.</p>
<p>If you are having issues trying to lay out elements, make sure you are aware of what the <strong>CSS Box Model</strong> is. I won’t try to explain it
all here as there are numerous <a href="http://www.w3schools.com/css/css_boxmodel.asp">better explanations</a> which you can find by searching.
Essentially it is a box which wraps every HTML element. The <strong>content</strong> is in the centre, surrounded by <strong>padding</strong>, which is wrapped by a <strong>border</strong> and
it has a <strong>margin</strong> on the outside.</p>
<p>The easiest way to visualise this if you have Chrome is to right-click any element on a web page and select Inspect. When the developer tools open
up, ensure the Elements tab is selected and scroll all the way down to the bottom of the Styles tab on the right-hand side of the page. You should see
a representation of the box model for the HTML element you have selected.</p>
<h3 id="why-does-the-layout-go-crazy">Why does the layout go crazy?</h3>
<p>An important fact to take away from this if you are having problems trying to align an element is that the content, padding, border and margin may
all have explicit widths and heights you need to take into account. Imagine you have a 100 pixel wide container and you put two divs inside it which are
each 45 pixels wide. Each div also has a 2 pixel wide border.</p>
<p>A quick calculation shows the contents of your container should be 98 pixels wide and fit perfectly. But if something isn’t working you may want to check
the values of the padding and margins too. If the entire width exceeds 100 pixels, the browser will most likely try to guess what you intended to do and make
the best fit it can. This can sometimes be the root of your alignment issues.</p>
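<p>The example above as a stylesheet (class names invented for illustration):</p>

```css
.container {
  width: 100px;
}

.item {
  float: left;
  width: 45px;            /* content width */
  border: 2px solid #000; /* adds 2px on each side: 49px rendered width */
}

/* Two .item divs: 2 x 49px = 98px, which fits inside 100px.
   Any padding or margin added later counts towards the total too. */
```

<p>As an aside, setting <strong>box-sizing: border-box</strong> on an element makes its declared width include padding and border, which many people find easier to reason about.</p>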
<h3 id="why-is-putting-content-in-the-centre-so-hard">Why is putting content in the centre so hard?</h3>
<p>Another good way to think of a page with CSS styling is: It’s a big tank of water and all the HTML elements in it float. This means when you create
a new element on the page, by default, it will be pushed right up to the top of the page. Unless it has a defined width it’ll also spread out to
the full width of the page. <strong>So how do you put it in the centre of the page?</strong></p>
<p>Next thing to google here is what is meant by a <strong>block</strong> element and what is meant by an <strong>inline</strong> element. Again, there is great documentation
already online but a block element is basically a box and it can have height and width. An inline element does not have dimensions and usually represents
inline content such as flowing text. <strong>Inline elements are usually constrained within block elements.</strong></p>
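<p>A quick illustration of the difference (selectors invented for the example):</p>

```css
/* Inline elements ignore width and height... */
em { width: 200px; }        /* no effect: <em> is inline by default */

/* ...block-level elements respect them */
div.panel { width: 200px; } /* works: <div> is a block element */

/* display can promote an inline element so dimensions apply */
span.badge { display: inline-block; width: 2em; }
```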
<p><strong>So we still need to centre this?</strong> First it’ll need a width which means it’ll also need to be a block element. Remember that logically the width you specify
needs to be larger than the content inside the element <em>(otherwise it doesn’t add up!)</em> and needs to be smaller than the width of the browser screen. Assuming
you have this working so far, the only other style you should need is:</p>
<div class="highlighter-rouge"><div class="highlight"><pre class="highlight"><code>margin: 0 auto
</code></pre></div></div>
<p>The zero refers to the top and bottom margin and auto refers to the left and right margins. Auto just means balance out the left and right margins equally,
meaning your element should now be centred in the page.</p>
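<p>Putting the pieces together, a minimal centring sketch (class name invented):</p>

```css
.page-content {
  width: 600px;   /* a block element with an explicit width... */
  margin: 0 auto; /* ...0 top/bottom, auto left/right: centred */
}
```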
<h3 id="css-is-global-so-deal-with-it">CSS is global so deal with it</h3>
<p>In most areas of software development we’ve decided that using global scope for anything usually causes us problems somewhere down the line so we’ve
trained ourselves to encapsulate behaviours and properties into objects. I don’t think this practice has made it into the CSS world yet.</p>
<p>In fact the “Cascading” part of the name seems to encourage you to create a top-level (i.e. global) style and let it flow down to embedded elements across multiple pages.
Which is a great idea in a perfect world if you know what the final structure of your HTML will be.</p>
<p>In the real world, sites evolve organically, different types of pages are added to sites and layouts are tweaked. Through experience what I’ve found happens is that
global styles will be created but they will be overridden multiple times by styles targeted at lower level elements.</p>
<p>You can diagnose if this is happening by looking at the Styles tab in the Chrome developer tools. When you are inspecting an element, any style with a strike-through
is a style which would have targeted your element but another style has overridden it. You can tweak styles on and off in the developer tools using checkboxes.</p>
<p>Overriding a style is not necessarily a bad thing but if you are doing it excessively <strong>and</strong> having trouble styling your website, it could be an indicator that you
should stop, refactor and simplify your CSS.</p>
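<p>For instance, a pattern which produces lots of struck-through rules in the inspector (selectors invented for the example):</p>

```css
/* A broad global rule... */
p { color: #333; }

/* ...overridden for paragraphs inside a sidebar... */
.sidebar p { color: #666; }

/* ...and overridden again for a special case. Inspecting such a
   paragraph shows the first two color declarations struck through. */
.sidebar p.highlight { color: #c00; }
```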
<h3 id="there-is-a-default-css-stylesheet">There is a default CSS stylesheet</h3>
<p>When I began playing with CSS, some of the issues which plagued me that I couldn’t resolve related to layout behaviours when I had not even written any style rules.
<strong>Where is that margin or spacing coming from? I haven’t even created any margin rules yet!</strong></p>
<p>So, it turns out the browser has its own default stylesheet which is applied. These may differ slightly between different browsers and browser versions but
the <a href="https://www.w3.org/TR/CSS2/sample.html">W3C default stylesheet for HTML 4</a> is a good baseline reference for what is probably happening in your page by default.</p>
<p>As a side note, there are so-called <strong>CSS resets</strong> out there of varying scope. These are stylesheets which attempt to completely or partially override any styling applied by the browser
default stylesheet. It’s up to you but I would consider using CSS resets to be an advanced topic which you can safely avoid unless you are more curious.</p>
<p>The default stylesheet shouldn’t get in your way and in the vast majority of cases it helps you out by setting up common HTML tags to display in the way you would expect them to.
<strong>Such as ordered and unordered lists having a left margin. Or headings being in larger, bolded text.</strong></p>
<h3 id="a-few-suggestions-to-keep-your-css-manageable">A few suggestions to keep your CSS manageable</h3>
<p>Finally, here are a few tips:</p>
<ul>
<li>
<p><strong>Put ALL your styles in a CSS file:</strong> Don’t put any styles inline. You’ll regret it later if you have styling spread all over your site. Keep them in a style sheet.
If a style only ever needs to target a single element, put it in the stylesheet and target it at the ID of the element.</p>
</li>
<li>
<p><strong>Split your styles across multiple CSS files if it makes sense:</strong> If you have built up many rules it may make sense to create different stylesheets for the types of rules
you have. You could have a typography stylesheet which only holds font-related styles. You could also have a layout stylesheet which only holds rules relating to spacing and alignment.
Or, you could put all the rules related to styling a HTML component on your page in a separate stylesheet.</p>
</li>
<li>
<p><strong>Use top-level styles but don’t make them too opinionated:</strong> Generally, only apply a top-level style if you are confident you want that style to apply to <strong>EVERYTHING</strong> on your page now
and in the future. Out of habit, I usually set <strong>body</strong> to have a <strong>font-family</strong> of <strong>sans-serif</strong> to banish the Times New Roman font but I’ll target all other rules at lower level elements. Especially
anything which is affecting layout.</p>
</li>
<li>
<p><strong>Refactor your CSS early:</strong> Large and messy CSS is a nightmare to change! You don’t have unit tests to protect you. If you change a rule which is applied site-wide, you’ll have to view all the pages
in your site to verify you haven’t broken anything. <em>Alternatively, tweak the style and pray it works!</em> It’s preferable to avoid getting into this situation if you can though. While you are coding
your CSS, if you see an opportunity to simplify a rule while you are creating it (or just after) and can easily verify the effect of the refactoring.. <strong>DO IT!</strong> I don’t believe large scale CSS refactoring
after the fact really works.. You end up just scrapping the stylesheet and starting over as that’s the more workable solution. Your objective should be to style your site with the simplest,
most straightforward stylesheet you can. The prospect of changing or adding a style in the future should not fill you with dread.</p>
</li>
<li>
<p><strong>This is not a CSS tutorial or even a guide to best practice:</strong> There are numerous great tutorials for CSS on the web so please don’t trust my opinions on this page as even being examples
of best practice. <em>I’m a developer, not a designer.</em> What I hope I have shown you is how to avoid a few gotchas and what you can research to be able to tackle CSS with confidence.</p>
</li>
</ul>As a web developer (as opposed to a web designer) I have a love/hate relationship with CSS.Aiming below perfection2017-01-30T20:55:00+00:002017-01-30T20:55:00+00:00https://www.rarelyprolific.co.uk/2017/01/30/aiming-below-perfection<p>What I am about to write about is not a new and revolutionary idea. It’s something I’ve read about countless times before.
It’s also something I have fairly consistently ignored over the past couple of decades!</p>
<p>That is: <strong>The idea of just starting doing something! Like now! RIGHT NOW!</strong></p>
<p>If you are anything like me, the voice in your head which blocks you most of the time is the one which tells you that a job worth doing
is worth doing well. You should be aiming for high quality and going for perfection. Except, that is a very high standard
to set yourself.. Especially when embarking on a new project.</p>
<p>The reality is that I often have an idea and start planning it out in my head. I may fire up the computer and code something.
Time ebbs away. Then I’m not happy so it gets tweaked. I have some more ideas about how it could be even better and get distracted.
More time passes. Eventually, real world responsibilities catch up with me and I run out of free time. The result is whatever
I was working on gets either abandoned or mothballed for a long time.</p>
<p>Here’s another approach: <strong>Do something quick! Fix it up later!</strong></p>
<p>That’s right, I’m implying you should throw something out there. It won’t be perfect. There will be a hundred things wrong with
it. You can fix it up along the way but you’ve started.</p>
<p>I’ve been meaning to start this blog for roughly five years now. Yeah, you heard that right.. <strong>FIVE YEARS!</strong> So just after last Christmas
I set myself a goal that I would have something up and running and at least one blog post published by the end of January 2017.
<em>(This is post two so I’m exceeding my own expectations here!)</em></p>
<p>If I’m honest about the process, I did procrastinate horrifically at times and the scope crept up significantly. The following blocked me:</p>
<ul>
<li>Learning how to use Jekyll and GitHub Pages</li>
<li>Installing Ruby and Ruby Gems</li>
<li>Looking up how to get Jekyll working well on Windows</li>
<li>Not liking the default Jekyll theme and failing miserably at getting other themes working</li>
<li>Persevering, changing tack and finding a theme I actually, partially, liked</li>
<li>Tweaking the theme until I liked it enough</li>
<li>Getting it all building locally and committing to Github</li>
<li><strong>WRITING AND PUBLISHING A BLOG POST!</strong></li>
</ul>
<p>So I must be happy now. Yes, it’s working.. I’ve got something done and you are reading this <em>(hopefully reading down this far too!)</em>.
It’s far from finished though and there is a ton of stuff I still want to improve, such as:</p>
<ul>
<li>I need to tag my posts and articles with categories</li>
<li>I want to have a search feature</li>
<li>Not happy with my writing style yet and my grammar sucks</li>
<li>I need more graphics in the blog posts to separate the text</li>
<li><strong>AND SO ON!</strong></li>
</ul>
<p>But.. I can still iterate on what I have already achieved and make it better. Absolutely nothing is set in stone.</p>
<p>The point I hope I’m making is that achieving something small can often be motivation in itself to keep building on what you have
already started. <strong>No more blank canvas. The wheels are already turning!</strong></p>What I am about to write about is not a new and revolutionary idea. It’s something I’ve read about countless times before. It’s also something I have fairly consistently ignored over the past couple of decades!Innovation, learning and biting off more than you can chew2017-01-25T21:55:00+00:002017-01-25T21:55:00+00:00https://www.rarelyprolific.co.uk/2017/01/25/innovation-and-learning<p>There is a constant drive for innovation in software development. Organisations want to be seen to be embracing new technology and
most developers have a built-in desire to play with new toys.</p>
<p>Here’s a scenario I was involved in a few years ago.</p>
<p>We had a trusty ASP.NET WebForms application which performed a few simple but vital internal business functions. It didn’t look very nice
but it was used by a few users who seemed to be content with it and we had no complaints <em>(and we definitely got complaints about the
stuff they didn’t like!)</em>. The back-end code was a little gnarly but generally it worked well and did its job.</p>
<p>But the world moves on and we had to change a few services and databases which propagated data around our systems. This meant we also had to modify our
WebForms application to maintain compatibility. After some analysis it became apparent that we would have to make some fairly
extensive changes to the nasty back-end code of the application as the shape of the source data was now significantly different.</p>
<p><strong>So we already had a reasonably large task ahead of us.</strong> But there was also a nagging feeling that this was a really ugly old legacy application.
We could do all the necessary work rewriting the back-end code but it would still look like a 1990s web application on the front-end.
None of the users would notice anything had changed, which could be viewed as a positive, but there would be no recognition of the amount
of blood and sweat it had taken to upgrade the application.</p>
<p>We decided it would be really good if we could give the front-end of the application an upgrade too. We were already mashing up the code so
why not! Everyone was using JavaScript these days so we should really be doing some of that too. We also wanted a modern looking design and
smooth data refreshes via AJAX instead of reloading entire pages.</p>
<p><strong>This is all good stuff but our reasonably large task had now grown into something massive.</strong> Even though we didn’t realise this at the time.
To be fair, we did know that changing both the internal data processing and external design would be a fairly large job but if we just
did the back-end first we wouldn’t get the chance to do the front-end design we really wanted to work on.</p>
<p>As it happened we succeeded with what we set out to do and we got it all done by the required deadline. The back-end code was upgraded so it
did exactly what it needed to do <em>(it would have been game over if that wasn’t the case to be honest)</em>.</p>
<p>The front-end was.. interesting! We had indeed given the application a whole new user interface. It was a little rough around the edges but definitely looked more modern.
It worked but we had a few niggly user experience issues, such as the page instantly loading half blank and then most of the data appearing a couple of
seconds later when an AJAX request returned.</p>
<p>All things considered, was this a successful project? If you look only at the end result I think we won. It worked! The user experience wasn’t perfect but the
original application didn’t have great UX either.. We just had different issues in the new version.</p>
<p><strong>I’m a big believer in the concept of EVERYTHING having a cost.</strong> Especially in software development. In fact, modern software methodologies often tell
us to <a href="http://ronjeffries.com/xprog/articles/practices/pracsimplest/">“do the simplest thing that could possibly work”</a>. It’s fair to say we completely
broke this rule and then some.</p>
<p>The hidden cost for us, which isn’t apparent from the above explanation, was the amount of stress and pulling our hair out we had to go through to
get the front-end code working. If we were taking an MVP (<a href="https://en.wikipedia.org/wiki/Minimum_viable_product">Minimum viable product</a>) approach
we probably should have initially just worked on the back-end code.</p>
<p>But, with hindsight, the overarching issues which exacerbated the development process more than anything were:</p>
<ul>
<li>No one on the team knew much about JavaScript development before we started the work.</li>
<li>We underestimated how much learning we would have to do “on the fly”.</li>
<li><strong>Therefore..</strong> We couldn’t begin to anticipate some of the issues we ended up having to deal with whilst writing and debugging the code.</li>
</ul>
<p>It’s easy to judge this as an observer and ask why a bunch of developers would dive head first into problems they don’t have a clue how to solve (or how
to even estimate with any degree of accuracy for that matter) but I’ve personally witnessed this scenario happen at more than one organisation and
with both novice and experienced developers.</p>
<p>I think it happens when you have a workplace culture which doesn’t allocate any real time to enabling developers to learn new skills.
It can happen when a developer’s time is <strong>ALWAYS</strong> tied to delivering a feature, so there is no time in reserve for personal development. They can get around
this limitation by actually producing the new feature using exciting new languages and libraries whilst learning them simultaneously.
Two birds, one stone! Developers are great problem solvers after all.</p>
<p>But the cost is usually significantly more time spent in the debugger trying to diagnose problems based on code you don’t really understand at this
point (Googling Stack Overflow to the rescue!). This is in addition to the bugs you have to fix in the business logic which was always part of the
task anyway. Oh, and get it done by the same deadline too. <strong>Stress! Argghh!</strong></p>
<p>Personally I don’t think this practice works. You are taking on risk which you can avoid. You may deliver on time.. Or you may not deliver at all if you
can’t get the new technology to do what you need it to. Either way, you are probably going to end up putting yourself under more pressure and stress than
you need to and you aren’t going to be at your best if you are doing that on a regular basis.</p>
<p>Part of the solution is that your organisation (if they don’t already) should really have a company strategy for developer personal development. Good developers
generally learn in their own time but this behaviour should be ingrained into the workplace too.</p>
<p>Ultimately though, developers should have the discipline to learn and try out new skills in safe environments and keep unknown or unproven technologies away
from the production codebase. (I’ll elaborate on techniques for doing this in a future post.)</p>There is a constant drive for innovation in software development. Organisations want to be seen to be embracing new technology and most developers have a built-in desire to play with new toys.