Olive Branch Technology
  • Services
    • Management Consulting >
      • Professional Development
    • Big Data/Analytics
    • Software Engineering
  • Dollar Dashboards
  • News and Information
  • Contact
  • Learning

News and Information

Case Study: Olive Branch Helps Client Reduce Data Costs 99%

6/26/2020

0 Comments

 
About our case studies. To guard against disclosing information that our clients may not want shared, we do not use company or individual names in our articles.
Data storage has become complicated. Your options for database technology are far greater than in the past, with each solution offering different strengths and weaknesses. I remember a time when, if you wanted to store any meaningful volume of data, your options were Microsoft SQL or Oracle ... and then eventually MySQL. But with the increase in available computing power and the drop in the cost of high performance drive space, the number of options for database solutions has skyrocketed. Choosing the right technology to match your requirements can provide a high performance, low cost solution, but using the wrong solution can kill your budget. Read on to learn how Olive Branch Technology slashed one company's data storage costs by more than 99%.
One of our clients was faced with mounting database costs resulting from a system that was built on top of Microsoft SQL. We're not going to knock MSSQL here; it is an absolutely fantastic relational database. It is easy to maintain, powerful, and fast; an impressive technology. But with that power comes cost. Operating large scale relational databases is not cheap. As you add up licensing costs, server costs, drive costs, etc., you need to allocate a good chunk of your budget to operating the database. Even if you are using a hosted solution such as those offered by Amazon Web Services or Microsoft Azure, your costs rise significantly when you are working with genuinely large datasets.
Databases are not one-size-fits-all.
Knowing how to choose and use the right tool for the job can save your budget.

The wrong technology, or not knowing how to use the technology you have chosen, can kill your budget.
When our client first came to us, their database costs were only a few thousand dollars per year. However, forecasts of business growth put their database costs up into the six figures. Upon careful review we characterized their data needs as follows:
  • Data needed to be retained indefinitely.
  • It was unusual to actively use older data, but in the event it was needed, the data had to be available immediately.
  • Most of the data storage volume was time-series data.
  • Almost all of the actively used data was 30 days old or less, with the majority of the day-to-day operations using data that was less than 24 hours old.
  • Most time series data was accessed in either 1 day chunks or 30 day chunks; it was extremely rare to have to operate on more than one month of data at once.
  • Combining or aggregating data across multiple sources was extremely rare.
A few things jumped out at us:
  1. Using a relational database like MSSQL for time series data is complicated. A solution that works for a relatively small volume of data won't scale well; scaling time series data gets complicated and expensive with relational databases.
  2. The hot data - the data being accessed frequently - was really only 24 hours old at the most.
Here's what we did.

Scrapped the relational database. As wonderful as MSSQL is, it just wasn't the right solution for this system. It was expensive, and as the business grew, it would be difficult to maintain performance of the time-series-data analysis.

Introduced NoSQL databases. For basic entities that needed to be stored - account data, customer information, etc. - we created a solution that leveraged Google Datastore (now Firestore). This involved denormalizing the database and setting up structured entities to save in Datastore. Fast and inexpensive, Datastore let us store all of the data entities we needed with minimal cost.
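Denormalization is easier to see with a toy example. The sketch below is purely illustrative - the entity names, fields, and data are invented, and it uses plain dictionaries rather than the actual Datastore client - but it shows the core move: duplicating related data into one self-contained entity so reads need no joins.

```python
# Hypothetical illustration of denormalizing relational rows into
# document-style entities (the shape you might store in Datastore/Firestore).

# Relational shape: separate tables joined by keys.
accounts = {1: {"name": "Acme Corp"}}
contacts = {10: {"account_id": 1, "email": "ops@acme.example"}}

def denormalize(account_id: int) -> dict:
    """Build a single self-contained entity, duplicating the account
    name into the document so no join is needed at read time."""
    account = accounts[account_id]
    return {
        "account_id": account_id,
        "account_name": account["name"],  # duplicated on purpose
        "contacts": [
            c["email"] for c in contacts.values()
            if c["account_id"] == account_id
        ],
    }

entity = denormalize(1)
print(entity["account_name"])  # the read touches one entity, no join
```

The trade-off is classic NoSQL: storage is duplicated and updates must touch every copy, but reads become a single cheap key lookup.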
"Our bottom line has improved greatly after working with Jim and his team at Olive Branch Technology."
- Vice President, Undisclosed Company
Used BigQuery for time series data. BigQuery is a data warehouse. A data warehouse is designed for storing and batch processing huge volumes of data, but is not necessarily known for having zippy access times. Given that the majority of the information was going to sit and wait until, if ever, it needed to be accessed, we decided to put all of the data into a BigQuery data warehouse. However, we leveraged a high degree of partitioning (breaking the data storage into chunks) that allowed us to very quickly find and retrieve the data that we needed, when we needed it.
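To illustrate why partitioning makes retrieval fast, here is a toy sketch of the idea - not BigQuery's actual API, and the data and field names are invented. Rows are bucketed by day, so reading one day's data touches a single bucket instead of scanning the whole store.

```python
# Toy model of date-partitioned storage: one bucket per day.
from collections import defaultdict

partitions = defaultdict(list)  # day -> list of rows

def insert(row):
    """Route each row to its day's partition on write."""
    partitions[row["day"]].append(row)

def query_day(day):
    """Partition pruning: read only the one bucket for `day`.
    Every other day's data is never touched."""
    return partitions[day]

for d in ("2020-06-01", "2020-06-01", "2020-06-02"):
    insert({"day": d, "value": 1})

rows = query_day("2020-06-01")
print(len(rows))  # 2
```

This mirrors the access pattern described above: since queries operated on 1 day or 30 day chunks, day-level partitions meant each query scanned only the slices it needed.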
Caching strategies. For user interfaces that needed fast response times, we implemented a caching strategy that functioned much like a real-time database. By keeping commonly accessed information readily available while performing longer data interactions asynchronously in the background, we were able to match, and even exceed, the response times of a SQL server used for active storage.
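The caching idea can be sketched in a few lines. This is a hypothetical minimal read-through cache, not our client's implementation; a production version would add expiry (TTLs), eviction, and asynchronous refresh.

```python
# Minimal read-through cache: hot reads come from memory, and only
# misses fall through to the slow warehouse query.
cache = {}
warehouse_calls = 0

def slow_warehouse_query(key):
    """Stands in for an expensive batch lookup (e.g. a warehouse scan)."""
    global warehouse_calls
    warehouse_calls += 1
    return f"result-for-{key}"

def get(key):
    if key not in cache:              # miss: pay the slow query once
        cache[key] = slow_warehouse_query(key)
    return cache[key]                 # hit: served from memory

get("daily-report")
get("daily-report")
print(warehouse_calls)  # 1 -- the second read never hit the warehouse
```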

As a result of these changes, data storage costs for the first year of operation were reduced by 99.7%. We expected the cost savings to continue to be strong, but not that strong. After a few years in production, having reached anticipated data volumes (many orders of magnitude beyond year one volumes), we followed up by pricing out the hosted SQL server instances necessary to meet current demands and found that, depending on the solution provider, our chosen technology stack still represented a 99.7% to 99.9% savings over the original architecture.

This kind of savings is atypical. This system happened to have just the right combination of requirements to allow us to leverage high performance, low cost solutions. It is also important to note that there is not anything inherently inexpensive about Google's data storage solutions. In fact, this same combination of storage technologies, when used without careful attention to implementation, could cost thousands of times more per month.

The key is knowing what your needs are, knowing what technology solutions can best meet those needs, and knowing how to best implement those solutions. It's all about choosing the right tool for the job.

Tool Highlight: Automatic Versions

5/21/2020

0 Comments

 
When we're developing software, whether for our own products or for our clients, we have always been diligent about managing the version numbers of our compiled code. Reliable versioning lets us better track when issues were discovered or introduced and at what point they were fixed, provides a consistent model for applying labels in source control, and allows us to monitor the aging of bug reports ... but I digress. I'm not here to discuss the virtues of version numbering; I'm here to share a tool we've been using that has made managing version numbers, particularly in large projects, much easier.

We've been using a wonderful Visual Studio extension, Automatic Versions by Precision Infinity. The tool allows us to control the rules for how version numbers are assigned, automating some parts of the version number while maintaining manual control over incrementing the others. It helps us maintain consistency of versions across multiple projects with almost no effort. This has been a HUGE time saver for us.


Tool Highlight: Spell Check for Visual Studio

5/13/2020

0 Comments

 
Our new favorite extension for Visual Studio is the Visual Studio Spell Checker by Eric Woodruff. The plug-in brings a word processor quality spell check utility to Visual Studio for elements such as comments, string literals, and content (such as HTML). It knows not to try to spell check variables, HTML tags, scripts, etc. With spell-check-as-you-type and integration with Visual Studio's suggestions menu, embarrassing spelling errors in code comments, documentation, and content produced in Visual Studio are a thing of the past. Thank you, Eric, for releasing such a helpful tool.

The challenge of website analytics data

5/12/2020

0 Comments

 
Anyone with a website should be tracking their traffic. It isn't hard to do, and it isn't expensive. Google Analytics offers its platform for free, and it isn't a stripped down, oversimplified tool. It is a full featured data analytics platform. When I headed an analytics team we used the paid version of Google Analytics, and other than traffic volume limitations, support, and a few other geeky technology things, the free version is the same. Now if you're getting more than 10 million hits per month or have an analytics team that is programming a hundred custom data points, then you will hopefully be able to afford Google Analytics Premium. For the rest of us ... it is free and it is one of the best products on the market.

Here's the problem. As with any complex and powerful tool, using it isn't very straightforward. Whenever I help a client get a digital product up and running, I make sure they have Google Analytics installed so they can track traffic and usage. Without exception, when they log in to Google Analytics, they have no idea what they are looking at. With a little training I can usually get them to the point where they can find some useful information. Unfortunately the process is so cumbersome that they rarely, if ever, log back in. As an experiment, I had someone log into Google Analytics and do a little scavenger hunt to find basic information. I didn't ask them to find anything complex, just simple items like "how many page views did you receive yesterday" or "how many users visited your site in the last 30 days". To gather enough of the basics to get a good understanding of the traffic to their website took 45 clicks. So rather than spend time regularly checking on their traffic, they rely on me to update them periodically ... which comes at a cost.

This is why Olive Branch Technology developed Dollar Dashboards. We first created these dashboards to automatically update our own clients about their traffic, and now we're offering them to anyone. For just $1 per month (to cover our hosting costs) you can get your data delivered directly to you via email and get access to an easy to read online dashboard.



Stop Asking Programmers to Code in Interviews

5/5/2020

0 Comments

 
I'll just come right out and say it: Asking a programmer to code during an interview is a bad idea.

I've seen many interviews in which the interviewer - usually one of the programmers on the team the interviewee is hoping to join - asks the interviewee to write some code, either on a piece of paper or on a whiteboard, or sometimes (though rarely) on a computer. I'll even admit to being guilty of doing this when I was younger. I've watched my staff ask candidates to write code, and I've watched my teams debate over what they should have a candidate code during the interview. Eventually I'll intervene and put a stop to it. Here is what I tell them.

Asking someone to write code during an interview does nothing to give you insight into their programming ability. 

A) No one writes code on paper or a whiteboard ... not ever.
B) Modern programming tools are full of features such as auto-complete, immediate error checking, beautification, etc., none of which is available on a whiteboard.
C) Programmers have always relied on looking up syntax, code snippets, and how-to's on the internet. If you aren't willing to let your candidate do the same during the interview, then you shouldn't be asking them to code.
D) Programmers will remember syntax, features, functions, and algorithms they use often, and will not remember those they use infrequently or not at all. Do you know which of these categories your programming tasks fall into?

I can hear the push-back now. "But if we stick to basic coding, they should be able to do that."

To which I answer - if your coding task is that basic, it doesn't tell you anything.

Again, I can hear the push-back. "It will help us weed out people that can't code."

To which I answer ... if a candidate for a programming position has made it to a face-to-face interview and they can't code, then your candidate selection process leaves much to be desired. Perhaps fix that first.

There are other questions you can ask that will tell you about their programming capabilities, but it is important to let them draw from their own experience. Interviewing a programmer should be a discussion and if you ask the right questions, they can lead to meaningful discussions that will tell you more about your candidate than any coding assignment you throw at them.

Tell us about a particularly frustrating programming challenge you faced. A question like this allows the candidate to tell you about a real problem from their own experience, so it removes any bias you would introduce by choosing a problem for them to work on. It opens the door for you to ask how they solved the problem, to understand the nature of a challenge the candidate might struggle with, and to dig into how creative they get when solving a problem.

What is the worst coding error you've ever seen? I haven't met a programmer yet who hasn't seen some really bad code that is burned into their skull. The ability to explain the code and why it is particularly bad will tell you a lot about their own skills as a programmer. If they can't recall any bad code, there is a good chance they aren't good enough to recognize bad code when they see it ... and in all likelihood are producing bad code.

What development environment (IDE) do you use? What is one thing that frustrates you about it? First, if they can't discuss their own development environment, then they are a novice at best. Second, anyone who spends quality time hunkered down writing code will have something that frustrates them about that environment. Being able to articulate that will tell you a lot about their abilities and should lead to a discussion about programming topics and practices that the candidate is particularly passionate about.

Tell us about some coding conventions you or your team have. Again, anyone who doesn't know what a coding convention is probably doesn't have enough experience for you. This question shouldn't be used to judge the conventions themselves (programmers love to judge each other and each other's teams, but now is not the time). The range of answers you get back will be huge - anything from using Pascal case for variables to database access conventions to code performance analysis practices. Regardless of the complexity of the convention, the response should open the door for further discussion, such as the benefits or drawbacks of the convention, and offer opportunities for the candidate to demonstrate their knowledge and experience.

Questions such as these foster a dialog between the candidate and the interviewer(s). That dialog is where you will get an understanding of their experience and abilities. Giving them a programming task is usually more a test of their ability to memorize than anything else, and memorization ability most definitely does not equate to programming skill.


Olive Branch Technology Launches Dollar Dashboards

4/30/2020

1 Comment

 
Olive Branch Technology is pleased to announce the launch of Dollar Dashboards, a simple website metrics dashboard designed for busy website operators and business owners who don't have time to wade through mountains of data and reports. Because it integrates with Google Analytics, there is nothing to install (for existing Google Analytics users). With just a few clicks, customers can start receiving daily traffic summaries via email, with a comprehensive and easy to understand single-page dashboard available online - all for just a dollar a month.

Visit http://www.dollardashboards.net for more information.

Deceptive Data Presentation

2/7/2020

0 Comments

 
I was listening to the radio the other day when a story was being presented about the difference in opinion among age groups. The nice, authoritative sounding voice on the radio stated that group A had a positive opinion about the topic in question, with 60% responding 'yes'. Group B was presented as having a negative opinion on the topic, with 42% of group B responding 'no'. The way it was presented - verbally, and with emphatic language stating the clear difference between the groups - was very convincing. However, think about the numbers: one group at 60% yes, another group at 42% no. This being a yes or no question, the responses from these two groups are almost identical - 60% yes vs 58% yes. I was fascinated at how easily the results could be misrepresented - without altering the data - to support a position.

Ever heard the phrase "the data doesn't lie?" Nope. Forget what you learned growing up about data being objective. It isn't. Like most information, it can be spun and manipulated to present different angles on a story.

The technique presented above isn't uncommon. Here are a few of the ways data gets misrepresented.

1. Inverting data to emphasize a point. Often used with yes/no or positive/negative data sets, you will find folks presenting data in a way that supports a position. For example, suppose a survey shows that 55% of respondents were in favor of a referendum. If one wanted to make a case against that referendum, stating that 45% of respondents were opposed can paint the referendum in a negative light.
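The arithmetic of the inversion trick is trivial, which is what makes it so effective. A quick sketch, using the made-up referendum numbers and the radio example from above:

```python
# Inverting yes/no data: the same result, quoted from either side.
favor = 0.55
oppose = 1.0 - favor  # 0.45 -- identical data, negative framing

# The radio example: 60% "yes" in group A vs 42% "no" in group B.
# Converting both quotes to the same polarity makes them comparable.
group_a_yes = 0.60
group_b_yes = 1.0 - 0.42  # 0.58 -- nearly identical once aligned
print(round(oppose, 2), round(group_b_yes, 2))
```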

2. Playing with scale. When presenting data graphically, the choice of axis scale can have a big impact on how the data is perceived. Take a look at the two charts below. The chart on the left shows a sizable spike in the middle whereas the chart on the right is relatively flat. However, both charts were generated using the exact same data. The only difference is the range on the y-axis.
[Charts: the same data plotted twice, with different y-axis ranges]
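You can quantify the effect without drawing anything. In this hypothetical sketch, the same 10-unit spike fills half the chart's height on a tight y-axis but only 1% of it on a wide one:

```python
# A numeric sketch of the axis-scale effect.
values = [100, 100, 110, 100, 100]  # small spike over a flat baseline
spike = max(values) - min(values)   # 10 units

def visual_fraction(y_min, y_max):
    """Fraction of the plot's height the spike occupies."""
    return spike / (y_max - y_min)

zoomed = visual_fraction(95, 115)   # tight axis: 10/20 = 0.5
flat = visual_fraction(0, 1000)     # wide axis: 10/1000 = 0.01
print(zoomed, flat)
```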
3. Misuse of pie charts. This is one of my favorites. In all fairness, I usually see this done less out of an attempt to mislead and more out of a lack of understanding of how the chart works and the nature of the data. As an example, let's say I conduct a survey and ask people to choose which types of salty snack they like - potato chips, pretzels, Doritos, tortilla chips - and they can choose as many as they like. The results are 55% of respondents like potato chips, 20% like pretzels, 50% like Doritos, and 40% like tortilla chips (totally made up data, by the way). You may encounter people using a pie chart to represent the results, like the chart below. The problem here is that the responses are not mutually exclusive, so the percentages do not add up to 100%.
[Chart: a pie chart incorrectly built from the multi-select snack percentages]
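The quickest check for this mistake: add up the slices. With the made-up snack data from above, a multi-select question sums to well over 100%, so a pie chart cannot represent it as shares of a whole:

```python
# Multi-select survey results (percent of respondents choosing each;
# respondents could pick several, so the figures overlap).
likes = {"potato chips": 55, "pretzels": 20, "Doritos": 50, "tortilla chips": 40}

total = sum(likes.values())
print(total)  # 165 -- far more than the 100% a pie chart assumes
```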
4. Accumulating non-cumulative data. Not all data can be added up. Say I want to measure how many people play basketball in my community each month this winter. In December I count 220 people, in January I count 230 people, and in February I count 180 people. Awesome. Now if I am asked how many people played basketball in total this winter, the answer is most definitely not 630. The answer is ... you don't know, because the same person may have been counted in more than one month.
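Sets make the problem concrete. In this invented sketch each attendee has an ID; summing the monthly counts double-counts repeat players, while the union of the sets gives the true number of distinct people:

```python
# Hypothetical per-month attendee IDs (heavily overlapping on purpose).
december = {f"p{i}" for i in range(220)}
january = {f"p{i}" for i in range(230)}
february = {f"p{i}" for i in range(180)}

naive_total = len(december) + len(january) + len(february)  # 630
actual_total = len(december | january | february)           # 230 here
print(naive_total, actual_total)
```

With real data the true total could be anywhere from 230 (everyone repeats) to 630 (no one repeats), which is exactly why the monthly counts alone can't answer the question.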

Aligning Professional Development with Business Needs

2/8/2018

6 Comments

 
Freebies
Template - Professional Development Planning Matrix
Sample - Professional Development Planning Matrix
Professional Development Worksheet

Read on to learn more about how to use them....

Professional development and annual reviews have become dirty words for many technical organizations. It shouldn't be that way. ​
As a technical manager, you know full well that your staff needs to continuously grow their skills or they become obsolete. Your team knows this as well. It becomes too easy to sit back and let your staff run with it - to self-direct their own professional development. When this happens, when professional development is not managed, I often find that A) employees learn new skills only on an as-needed basis, B) professional development aligns more closely with hobbies, or C) teams are drawn to expensive conferences (in warm places) that substitute for professional development.

Learning on an as-needed basis - On the surface, this type of on-the-job learning seems like a good idea. What better way to learn than by doing? And obviously the skills are directly applicable to the business.  This works fine for little things - maybe using LINQ for the first time in a project, or perhaps runtime image resizing in a Django web service. But when it comes to larger areas of growth, say database optimization theory, usability, design patterns ... you need to focus on such things with purpose. Don't think you can just let them happen organically.

Leaving it to your staff - generally, when left to guide their own professional development, you'll find that your technical team members gravitate to those areas they feel are fun - the shiny new objects. This isn't true for everyone, but generally, you'll find it holds. The problem here is that this time spent learning and growing skills that aren't applicable to your business is either time wasted, or worse yet, results in forced application of technology and techniques that don't fit the needs of the organization.

Conferences - Conferences can be great - they can introduce your teams to new technologies, new ideas, and new ways to go about their work. But do keep in mind that conferences and those hosting them have an agenda - they want to push particular solutions, products, technologies, or vendors. This isn't always in the best interest of your company.

Approaching Professional Development with a Purpose.
There is something you can do to help focus the energy of your team onto professional development activities that more closely align with your business needs. Consider the following five steps to get you started.

Step 1 - know what skills you need. What skills do you need from your staff? Make a list. If you've never done this before, it is a good exercise - you'll find the list handy when you're writing job descriptions as well. Go ahead and list every skill you need from your staff. Don't get too specific; think high level (knowing how to perform a grep search is too specific ... think broader, like an understanding of Linux). Account for theoretical skills, for example object oriented design, database optimization theory, algorithms, etc. Consider soft skills too - communication, teamwork, leadership.

Step 2 - break them down by role. Depending on the size of your organization you'll have different levels of specialization. A small startup may require its team members to have a broad set of skills. A large company will have greater specialization - DBAs, server administrators, technical architects, programmers, project managers, testers, etc. Know what skills you need from each role.

Step 3 - for each role, identify expectations. Depending on the size of your company, you may have a well defined hierarchy (e.g. Programmer level 1 - 6) while smaller organizations have more of an implied hierarchy (e.g. Karen is our top developer). Whether you have formal labels for your hierarchy or simply a gradient from entry level to expert, think about what you expect from each level. For example, you may expect an entry level programmer to have moderate C# programming skills, a basic understanding of database design, a basic understanding of the Windows operating system, basic communication skills, negligible leadership skills, and a strong knowledge of object oriented programming. On the other hand, you might expect your most senior programmer to be an expert in the programming language, have strong knowledge of operating systems, decent communication skills, strong mentoring ability, and an expert understanding of algorithms. Now think of all the gradients in between - what it might look like as your entry level developer grows to become an expert over the years.

Step 4 - map it out. I've always preferred to use a table to map out the skill expectations. Create a row for each skill and a column for each level within the organization (formal or informal). For each level, identify the expectation for the skill, from none at all through expert. Here is what such a table might look like:
[Table: skills as rows, levels as columns, each cell giving the expected proficiency from none through expert]
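If a spreadsheet feels too static, the same matrix can live as data. This is only a sketch - the skills, levels, and ratings are invented - but the `gaps` helper answers the Step 5 question of what separates one level from the next:

```python
# The planning matrix as data: skills x levels, with expectations
# encoded 0-4 so adjacent levels can be compared programmatically.
SCALE = ["none", "basic", "moderate", "strong", "expert"]

matrix = {
    # skill:           expectations for levels 1..4 (indexes into SCALE)
    "C# programming":  [2, 3, 3, 4],
    "Database design": [1, 2, 3, 3],
    "Communication":   [1, 1, 2, 3],
}

def gaps(cur: int, nxt: int) -> dict:
    """Skills where level `nxt` (1-based) expects more than level `cur`,
    mapped to the target rating - i.e. where to focus development."""
    return {
        skill: SCALE[levels[nxt - 1]]
        for skill, levels in matrix.items()
        if levels[nxt - 1] > levels[cur - 1]
    }

print(gaps(1, 2))  # what a level-1 person should grow toward
```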

Do this for each role within your team. Once you do this you will have created a roadmap to guide each team member's professional development.

Step 5 - make a plan. Perhaps you make it part of a formal employee review process, or perhaps you make it part of informal mentoring discussions. Sit down with each member of your team and discuss where they best fit on this chart - which column best describes them. For example, in my sample table, a programmer may best fit as a level 4. Now look at what differentiates a level 4 from a level 5. You've just identified the skills this individual should focus their energy on. Once you've done that, you can begin the discussions surrounding how they improve those skills - perhaps it is on-the-job training, maybe it is taking some classes, participating in professional organizations, conferences, etc.

You'll find that once you can identify specific areas of growth it will be much easier to create action items to include in a professional development plan. You'll also find that your team's own self-directed professional development activities will better align with your business needs.

You can download and modify these files to get you started:
Template - Professional Development Planning Matrix
Sample - Professional Development Planning Matrix
Professional Development Worksheet

Need some help getting started? Want someone to come in and help you work through the process? Contact Olive Branch Technology at info@olivebranchtechnology.com or call ​(414) 436-9110.
6 Comments

Estimation Matters

1/17/2018

0 Comments

 
Over the years I've seen very little attention given to the estimation of software projects. The work put into estimation isn't particularly glamorous, and the work is difficult. It takes years of experience to learn, and it is never perfect. Yet for all the attention estimation doesn't get, consider that the most common, and earliest, questions asked when considering a software project are "How hard will it be?" and "When can it be finished?" The answers, though typically given by an expert, are often given with little detailed analysis.

The consequences of poor estimation can be severe, though they aren't always obvious. Consider the following effects of poor estimation.
  1. Schedule overrun - Without attention to estimation, the project is subject to schedule overrun. Failing to deliver the project on - or close to - plan results in resources not becoming available for subsequent projects (a domino effect of delays), cost overruns, and a loss of confidence from project sponsors.
  2. Resource burnout - Without proper estimation of effort, you will find that your resources need to work long hours - evenings and weekends - to hit deadlines. Having to do so on a regular basis leads to burnout and ultimately turnover.
  3. Quality problems - To hit unrealistic deadlines, corners are often cut. These cuts often occur in testing, but you'll also find that quality suffers due to rushed execution and tired resources.
  4. Wasted hours - When overestimation occurs (rare, but it does happen!) you will find that resources still use the entire duration of the planned schedule. Where the project could have been completed early, freeing resources for other projects, they will be used inefficiently, filling the time allotted. Overestimation will often masquerade as accurate scheduling when in reality you are experiencing missed opportunities.
Accurate estimation is well worth the time, but you need to be willing to expend the effort to do it correctly. It is worth the effort to have your teams learn to do it right.

The Launch of Olive Branch Technology

12/16/2016

0 Comments

 
As of December 19th, 2016, Olive Branch Technology has officially launched. Offering a unique blend of management and technical consulting, Olive Branch Technology is uniquely positioned to assist its clients in realizing the success of their technology strategy. James Conigliaro, owner and principal consultant at Olive Branch Technology, brings with him years of both executive and hands-on technology experience. Most recently serving a dual role as the Vice President of Digital Strategy for the Milwaukee Journal Sentinel and Senior Director of Digital Technology and Analytics with its parent company, Journal Media Group, James oversaw the transition of the digital operations from its formation as a spinoff of the merger between Journal Communications and E.W. Scripps through its subsequent acquisition by Gannett. Beyond his executive experience, James has taught software engineering at the university level for more than a decade, bringing his expertise in planning, designing, and executing the development of complex software systems from the field into the classroom.

Whether you need an interim CTO or just want to bounce your ideas off of someone - Olive Branch Technology can help. Whether you need to craft a full big-data strategy or simply need help pulling together some analytics dashboards - Olive Branch Technology can help. Whether you need a complex software automation solution or just an extra set of eyes on a code review - Olive Branch Technology can help.

Contact Olive Branch at info@olivebranchtechnology.com.

    Archives

    June 2020
    May 2020
    April 2020
    February 2020
    February 2018
    January 2018
    December 2016

    Categories

    All
    Case Study
    Data
    Tools

    RSS Feed

Home

Management Consulting

Big Data & Analytics

Software Engineering

Contact

Copyright © 2016, Olive Branch Technology