Sunday 30 December 2012

Standard Costing: Material variances (Price, Usage, Mix & Yield)


By Jackie, Researcher
Topic: Education
Area of discussion: Cost & Management Accounting
Chapter: Standard Costing – Material variances (Price, Usage, Mix & Yield)


The objectives of this posting are to guide students through the computation of all material variances, to share a randomly picked ACCA Paper 8 Managerial Finance question with clear step-by-step workings and explanations, and finally to show you how to double-check your answers. Professional exams like ACCA and LCCI require students to compute advanced variances (i.e. direct materials mix and yield variances). Students normally have no problem handling direct materials price and usage variances, but struggle with the advanced ones (they are often confused when a normal loss exists). Hopefully, this sharing will help students understand this topic more clearly.


The breakdown of the materials variances



Formulas and descriptions:

Total direct materials variance
The total direct materials variance is the difference between the standard materials cost for the actual production and the actual materials cost. Alternatively, it can be computed by summing the direct materials price variance and the direct materials usage variance.
Total direct materials variance = standard materials cost – actual materials cost
Total direct materials variance = direct materials price variance + direct materials usage variance

Material price variance
The material price variance is equal to the difference between the standard price and the actual price per unit of materials multiplied by the quantity of material purchased:
Material price variance = (standard price per unit of material – actual price) x quantity of material purchased

Material usage variance
The material usage variance is equal to the difference between the standard quantity required for actual production and the actual quantity used multiplied by the standard material price:
Material usage variance = (standard quantity of materials for actual production – actual quantity used) x standard price per unit
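In code form, these two basic variances are one-liners. Here is a minimal Python sketch; the function names and the figures in the example are invented for illustration, not taken from any exam question:

```python
def material_price_variance(std_price, actual_price, qty_purchased):
    """(standard price - actual price) x quantity purchased.
    Positive = favourable, negative = adverse."""
    return (std_price - actual_price) * qty_purchased

def material_usage_variance(std_qty_for_actual_output, actual_qty_used, std_price):
    """(standard quantity for actual production - actual quantity used) x standard price."""
    return (std_qty_for_actual_output - actual_qty_used) * std_price

# Hypothetical figures: standard price £5/kg, actual price £5.50/kg,
# 1,000 kg purchased; standard usage 900 kg, actual usage 950 kg.
price_var = material_price_variance(5.00, 5.50, 1000)   # -500.0 (adverse)
usage_var = material_usage_variance(900, 950, 5.00)     # -250.0 (adverse)
print(price_var, usage_var)
```

A negative result signals an adverse variance in both functions, which matches the sign convention of the formulas above.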

Materials mix variance
The materials mix variance arises when the mix of materials used differs from the predetermined mix included in the calculation of the standard cost of an operation. If the mixture is varied so that a larger than standard proportion of more expensive materials is used, there will be an unfavourable variance. When a larger proportion of cheaper materials is included in the mixture, there will be a favourable variance.
Materials mix variance = (actual quantity in standard mix proportions – actual quantity used) x standard price

Materials yield variance
The materials yield variance arises because there is a difference between the standard output for a given level of inputs and the actual output attained.
Materials yield variance = (actual yield – standard yield from actual input of material) x standard cost per unit of output 
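The mix and yield calculations are where students usually stumble, so here is a Python sketch of both formulas. The material names, proportions and prices below are hypothetical, chosen only to make the arithmetic easy to follow:

```python
def materials_mix_variance(actual_qty, std_mix_proportions, std_prices):
    """(actual quantity in standard mix proportions - actual quantity used)
    x standard price, summed over all materials."""
    total_input = sum(actual_qty.values())
    variance = 0.0
    for m, used in actual_qty.items():
        in_std_mix = total_input * std_mix_proportions[m]
        variance += (in_std_mix - used) * std_prices[m]
    return variance

def materials_yield_variance(actual_yield, std_yield_from_actual_input,
                             std_cost_per_unit_output):
    """(actual yield - standard yield from actual input) x standard cost per unit of output."""
    return (actual_yield - std_yield_from_actual_input) * std_cost_per_unit_output

# Hypothetical standard mix: 60% material A (£2/kg), 40% material B (£5/kg).
# Actual input was 1,000 kg in total, split 550 kg A and 450 kg B.
actual = {"A": 550, "B": 450}
mix = materials_mix_variance(actual, {"A": 0.6, "B": 0.4}, {"A": 2.0, "B": 5.0})
# A: (600 - 550) x 2 = +100; B: (400 - 450) x 5 = -250; total -150 (adverse,
# because too much of the dearer material B was used).

# Hypothetical yield: standard output is 900 units from this input, actual
# output was 880 units, standard cost per unit of output £3.60.
yld = materials_yield_variance(880, 900, 3.60)   # (880 - 900) x 3.60 = adverse
```

Note how the mix variance nets to zero when the actual mix matches the standard proportions exactly, which is a quick way to double-check your workings.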




Answers and comments:





Additional readings, related links and references:

This link provides an extremely good and detailed step-by-step calculation and there are a lot of worked examples. Full formulas are provided and alternative methods for computation are shown clearly.

Materials mix and yield: Relevant to ACCA qualification paper F5. An extremely good discussion on variance analysis with excellent illustration, worked examples and clear explanation.

Standard Costing 2 Material Variances: “Managerial Accounting SFCC Fall 2007 Chapter 9 Videos”

This link provides a number of standard costing examples. There are a total of 6 parts in it. Good site to look at in order to master variance analysis.

Material mix and yield variances: Grahame Steven explains why understanding material mix and yield variance is a recipe for success.

Friday 21 December 2012

The New Salesman

The mix of good habits and even better
planning will be rewarded!
We’ve all got to start somewhere

Contrary to popular belief, good salespeople don't grow on trees.  Nor can you drive by the local mission and pick the one with the best-lettered "will work for food" sign.  Lastly, I want to drive a stake through the heart of that hundred-year-old myth that certain parents beget "natural born sales wizards".  On the contrary, good salespeople are created, molded, shaped, formed and trained.  Not-so-good salespeople can develop bad habits, learn the wrong kinds of behaviors and morph into non-performing drains on their employers.

Unfortunately, our industry has a habit of tossing new guys the car keys, pointing to the territory and saying "go get 'em, kid," rather than providing any kind of real training.   This was an issue back in our dad's day, a mistake in our day and a gargantuan mistake today.  Here's why.  According to the research conducted by Matthew Dixon and Brent Adamson in their book The Challenger Sale, the gap in results between average and top-rank salespeople is growing.  In the old days of purely transactional selling (Hey mister, want to buy a programmable controller?) the difference between the star seller and the average guy was 59%.  In our world, the world of knowledge-based solution selling, the difference between star and average is over 200%.

The kind of slow progress toward the top that comes from trial-and-error learning impedes profits.  Think about this.  Most new sellers start off just a shade below average in performance.  Once they get to average, a top performer is still outselling them by 200%.  And if we can get them to a point statistically halfway between average and star status, they are 100% more effective than the average guy.

Now I know what a lot of you are thinking: we hire engineers, technicians, classically trained professionals.  In a world of knowledge-based selling this is all good, but it still doesn't equate to sales moxie.  What's worse, since most of the leaders of this industry grew up in the age of zero training, many of us don't really know what a good training program looks like.

I believe the first six months set the stage for success by laying down a number of good habits which morph into a foundation for growth.  Yet very few distributors invest time in establishing a plan for ramping up the new person.  Instead they stand back and watch; after a few months they may realize there's a problem.  After a couple of informal talks with an already frustrated new salesperson, they finally decide to enroll the guy down at the local Dale Carnegie franchise.  Everyone struggles, money is lost and eventually the salesperson in question either figures things out or is flushed as a hiring error.

Finally, here is a thought from another industry expert.  In The Little Black Book of Strategic Planning for Distributors, Brent Grover estimates the cost of getting a new distributor salesperson to the point of being a profit generator at more than $150,000. But, he says, "if distributors capitalized these costs as one would a piece of production equipment – instead of writing it off as a current expense – that would be an asset on the balance sheet of at least $150,000 per salesperson."  The sooner you can get your new guy up to speed, the better.  I know companies where this number is far greater.

Now a few weasel words from yours truly.  All of this is dependent on you making a sound hiring decision and hiring errors do happen.  Further, it depends on you having a reasonable sales process, good supply partners and the finances to drive the training home. 

With all of this in mind, join us as we spend the next few weeks exploring the ins and outs of a good “on boarding” process.

 
Distributor Planning Made Easy. Check out our Distributors Annual Planning Workbook:
http://amzn.com/1481196448

Thursday 20 December 2012

Comparison of Net Present Value and Internal Rate of Return Methods


By Jackie, Researcher
Topic: Education
Area of discussion: Cost & Management Accounting
Chapter: Capital Budgeting & Cost Analysis


The objective of this posting is to discuss, explain, and justify the superiority of NPV over IRR. We will look into the conditions that make the IRR method inappropriate to use (i.e. the IRR's technical shortcomings).


Introduction
Net Present Value Method
It calculates the expected monetary gain or loss from a project by discounting all expected future cash inflows and outflows back to the present point in time using the required rate of return. In other words, it is the present value of the net cash inflows less the present value of the net cash outflows (if any), and then minus the project’s initial investment outlay. A positive NPV indicates that an investment should be accepted, while a negative value indicates that it should be rejected. A zero NPV calculation indicates that the firm should be indifferent to whether the project is accepted or rejected.
Internal Rate of Return Method
It is the rate of return promised by an investment project over its useful life. It is sometimes referred to simply as the yield on a project. The internal rate of return is computed by finding the discount rate that equates the present value of a project's cash outflows with the present value of its cash inflows. In other words, the internal rate of return is the discount rate that results in a net present value of zero. The decision rule is that if the IRR is greater than the opportunity cost of capital, the investment is acceptable, as it is profitable and will yield a positive NPV. Alternatively, if the IRR is less than the cost of capital, the investment should be rejected, as it is unprofitable and will result in a negative NPV. When the IRR is equal to the opportunity cost of capital, the firm should be indifferent to whether the project is accepted or rejected.
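Both decision rules can be illustrated with a small Python sketch. The project figures are hypothetical, and the IRR is found here by simple bisection rather than any particular textbook procedure:

```python
def npv(rate, cash_flows):
    """cash_flows[0] is the time-0 flow (usually a negative outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=1.0, tol=1e-9):
    """Bisection search for the rate where NPV = 0.  Assumes a conventional
    project, so NPV falls steadily as the rate rises between lo and hi."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical conventional project: outlay 1,000, then inflows of 600 and 600.
flows = [-1000, 600, 600]
print(round(npv(0.10, flows), 2))   # 41.32 -> positive NPV: accept
print(round(irr(flows), 4))         # 0.1307 -> IRR above the 10% cost of capital
```

Both rules agree here, which is exactly what the next section says to expect for conventional, independent projects.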

Comparison of Net Present Value and Internal Rate of Return Methods
In many situations the internal rate of return method will result in the same decision as the net present value method. In the case of conventional projects (in which an initial cash outflow is followed by a series of cash inflows) that are independent of each other (i.e. where the selection of one project does not preclude the choice of another), both NPV and IRR rules will lead to the same accept/reject decisions. However, there are also situations where the IRR method may lead to different decisions being made from those that would follow the adoption of the NPV procedure.
Mutually exclusive projects 
If projects are mutually exclusive (i.e. the acceptance of one project excludes the acceptance of another), it is possible for the NPV and the IRR methods to suggest different rankings as to which project should be given priority; for example, choosing one out of three possible factory locations. When evaluating mutually exclusive projects, the IRR method is prone to indicating wrong decisions (i.e. incorrectly ranking projects) due to its reinvestment assumptions, especially when projects have unequal lives or unequal initial investment outlays. For instance, compare an investment of £10,000 that yields a return of 50 per cent with an investment of £15,000 that yields a return of 40 per cent. If only one of the investments can be undertaken, managers will normally choose the project with the highest IRR, but bear in mind that in actual fact the first investment yields only £5,000 while the second yields £6,000. Thus, if the objective is to maximize shareholders' wealth, the NPV provides the correct measure.
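The ranking conflict above can be made concrete in a few lines of Python, if we treat each investment (purely as an assumption for illustration) as a one-year project and pick a 10% cost of capital:

```python
def npv(rate, cash_flows):
    """cash_flows[0] is the time-0 flow (the initial outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# The two investments from the text, assumed to be one-year projects:
# £10,000 returning 50% and £15,000 returning 40%.
small = [-10000, 15000]   # IRR = 50%
large = [-15000, 21000]   # IRR = 40%

cost_of_capital = 0.10    # assumed for the illustration
print(round(npv(cost_of_capital, small), 2))  # 3636.36
print(round(npv(cost_of_capital, large), 2))  # 4090.91
# IRR ranks the small project first, but the large one adds more wealth.
```

The IRR ranking and the NPV ranking disagree, and it is the NPV ranking that maximizes shareholders' wealth.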
Percentage returns 
We can sum the NPVs of individual projects to calculate the NPV of a combination or portfolio of projects, because the NPV method is expressed in monetary terms, not in percentages. For example, Project Alpha consists of two smaller projects: South (NPV = £12,500) and West (NPV = £7,500). The NPV of Project Alpha is then £20,000. In contrast, the IRRs of individual projects cannot be added or averaged to represent the IRR of a combination of projects.
Volatile cost of capital 
The NPV method can also be used when the cost of capital varies over the life of a project. For instance, suppose Vortex Plc has made an initial investment of £10,000 and expects to receive a cash inflow of £25,000 in year 1, when the cost of capital is 12%, followed by a cash inflow of £18,000 in year 2, when the cost of capital is 10%, and finally a £5,000 cash inflow in year 3, when the cost of capital is 8%. The NPV can then be calculated as £31,163. It is not possible to use the IRR method in this case, because different costs of capital in different years mean there is no single cost of capital against which the IRR (a single figure) can be compared to decide whether the project should be accepted or rejected.
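The £31,163 figure can be reproduced in a few lines of Python. Note the working assumes three-decimal discount-table factors, and discounts each year's inflow at that year's rate compounded on its own, which is how the figure above is arrived at:

```python
# Vortex Plc example: each year's inflow discounted at that year's cost of
# capital, using three-decimal discount factors as in standard tables.
inflows = [25000, 18000, 5000]
rates = [0.12, 0.10, 0.08]

npv = -10000
for year, (cf, r) in enumerate(zip(inflows, rates), start=1):
    factor = round(1 / (1 + r) ** year, 3)   # 0.893, 0.826, 0.794
    npv += cf * factor
print(round(npv, 2))   # 31163.0
```

The three products are £22,325, £14,868 and £3,970; less the £10,000 outlay, that gives the £31,163 NPV quoted above.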

Reinvestment assumptions 
The assumption concerning the reinvestment of interim cash flows from the acceptance of projects provides another reason for supporting the superiority of the NPV method. The implicit assumption of the NPV method is that the cash flows generated from an investment will be reinvested immediately at the cost of capital (i.e. the returns available from equal-risk securities traded in financial markets). However, the IRR method makes a different implicit assumption about the reinvestment of the cash flows. It assumes that all the proceeds from a project can be reinvested immediately to earn a return equal to the IRR of the original project. This assumption is likely to be unrealistic, because a firm should have accepted all projects which offer a return in excess of the cost of capital, and any other funds that become available can only be reinvested at the cost of capital.
Unconventional cash flows 
When the signs of the cash flows switch over time (i.e. when there are outflows, followed by inflows, followed by additional outflows and so forth), it is possible that more than one IRR may exist for a given project. In other words, there may be multiple discount rates that equate the NPV of a set of cash flows to zero. In such cases, it is difficult to know which of the IRR estimates should be compared to the firm's required rate of return.
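A classic illustration in Python: the hypothetical cash flows below change sign twice, and the NPV is zero at both 10% and 20%, so the project has two IRRs and the IRR decision rule breaks down:

```python
def npv(rate, cash_flows):
    """cash_flows[0] is the time-0 flow."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical unconventional project: outflow, inflow, then another outflow.
flows = [-1000, 2300, -1320]

# NPV is zero at BOTH 10% and 20% -- two IRRs for one project.
print(npv(0.10, flows), npv(0.20, flows))   # both are zero (to floating-point rounding)
```

With an NPV curve that crosses zero twice, there is no single "yield" to compare against the cost of capital, whereas the NPV at the actual cost of capital still gives an unambiguous answer.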


Additional readings, related links and references:
“Perils of the Internal Rate of Return”: This is an extremely good link. It heavily focuses on discussion with clear and detailed examples. Calculations and graphs are all provided. This is highly recommended for those who are doing a research project on this issue or doing revision for coming exams.
“Which is a better measure for capital budgeting, IRR or NPV?”: This site is suitable for beginners. It provides brief explanations, examples and concepts. It can give you a quick understanding.
http://www.investopedia.com/ask/answers/05/irrvsnpvcapitalbudgeting.asp#axzz2Favl7l6v

“Chapter 6 - Investment decisions - Capital budgeting”: Well, this site looks like an e-book to me. The good thing about this link is it offers more detailed calculations instead of theories. Complete formulas are given too. 

“Net Present Value Vs Internal Rate Of Return (NPV & IRR) & Excel Calculations For DCF”: If you prefer to learn via hearing instead of reading, then this might suit you. 

“How to calculate NPV and IRR in Excel”: The voice is clear, good explanation and most importantly, it is a step-by-step tutorial approach.

Wednesday 19 December 2012

Better Point of Sale Reports with Variance Analysis (update)

I've just revised and updated one of the most popular posts on this blog, adding more detailed descriptions, a graphical view of the output, and a clearer path to action based on these reports.  Follow the link below to the updated post.

Better Point of Sale Reports with "Variance Analysis": Velocity, Distribution and Pricing.. oh my !

Routine, weekly point-of-sale reports tend to look very similar.  For various time buckets (last week, last 4 weeks, year to date) we total sales in both currency and units, then compare to prior year.  Add in a few more measures to look at retail pricing, inventory or service-level metrics and you may struggle to make it fit on a page.   And from a CPG standpoint, POS reporting is only half of the story: a CPG's sales targets are not based on POS, they are based on shipments to the retailer.  How can you get a good overview of POS and reconcile that with shipments, all in one report?

Monday 17 December 2012

Better Business Analytics - Christmas list

It's that time of year again: my kids have written, re-written and re-re-written their Christmas lists now so we all hope Santa will read them carefully and take notice.

With just a few days left before the holiday season hits I wanted to do something a little more fun, so I've pulled together a list of things that I think every Business Analyst should want.  Some are free to acquire, just costing your time to learn, others you may wish to ask Santa (or your CFO) to provide.







64 bit Operating system, applications and more memory

32-bit operating systems cannot address more than 4 GB of computer memory (RAM), regardless of how much you load onto your hardware.  Forget "Big Data" for a moment - you can fill 4GB relatively easily with desktop tools - if you want to do any serious data processing on your laptop/desktop environment you will need more than that.  RAM is cheap, easy to upgrade and most modern laptops will take up to 8GB without any problem. Max out your RAM.

Note: 8GB of RAM for the laptop I am writing this on is currently $41.99 at www.crucial.com.


Hard-drive upgrade

Solid State Disks (SSDs) provide huge speed improvements over their spinning, hard-drive counterparts.  If you are crunching numbers from files or a local database, the hard drive may well be slowing you down.  A modest investment (the 480 GB SanDisk SSD is currently available on Amazon for $360) can save you a lot of time.

Serious computing power

If you have access to a high powered server this may be of less use, but I was surprised recently to find out just how much computing power you can now get in a desktop workstation.  This beast with 2 CPUs (32 cores), 128 GB of RAM and 2TB of super-fast SSD hard-drive costs a little over $6000.  That's an awful lot of power (more than most servers I've worked with)  for the price of 2-3 good laptops.

Excel 2010 with PowerPivot.

Excel is a superb tool for interacting with your data for prototyping and occasionally for delivering models and results.  It can't do everything and for many problems you will need to turn to more specialized tools but any analyst that tells you they don't use it at all is probably not a very effective analyst.  With the ability to write VBA, to customize and embed your own calculations, it can be a very powerful modeling tool indeed.

With Excel 2010 and the free PowerPivot add-in, Excel now has the capability to embed powerful BI features too.  Integrate data from different sources, load enormous amounts of data (vastly more than the worksheet limit of about 1 million records), and define more complex measures and functions to apply to this data with DAX (the Data Analysis Expressions language).  If you are not already there, upgrade now to 2010 - and make sure it's the 64-bit version to blow past the 4GB memory limitation.

Note: Office 2013 may be just around the corner, but with Microsoft's latest offer if you buy now (Oct'12 thru April '13), you get Office 2013 for free when it's released.   Microsoft's Office Pre-Launch Offer.

Time to learn R for statistical computing

OK - I know - those of you who are heavily invested in another package (SAS, SPSS, Statistica, ...) do not get this.  Those are great tools and why should you change?  Well, I'm not suggesting you swap one package for another - I'm suggesting that there is room in your head for more than one tool and R has a lot going for it.

Pros
  • According to Revolution Analytics, R is now available as an embedded, in-database analytics engine for Oracle, SAP HANA, Netezza, Teradata and others.  This is a very exciting development, allowing advanced statistical capabilities to be called from within the database from SQL. Handle routine analytics in-line and on-demand.
  • It's free - really - free.  R is an open-source tool you can download from http://www.r-project.org/
  • It has packages for an enormous variety of statistical analyses; you will probably not need or use 90% of them.
  • It has great graphical capabilities for visualization.
  • It's callable from .NET languages too via RDotNet.  The interface is not well documented, but it does work.
Cons
  • The user interface is "ugly" unless you love command line applications - I don't.  RStudio does help with this.
  • If you are not used to vector-based programming languages it may take you a while to grasp.  (Try "The Art of R Programming" for help.)
  • There are a lot of commands to remember, I use my cheat sheet a lot.

Time to learn SQL

 In reality the time to learn SQL was probably about 20 years ago, but if you have not yet done so, catch up quickly. 

SQL (Structured Query Language) is the programming language for relational databases.  Whenever you interact with a database, it is some dialect of SQL that selects, updates or inserts your data efficiently.

Most of your data will now come to you from a database of some form.  It is a common requirement to integrate data from multiple sources (without losing any), then filter, aggregate and sort as a precursor to more advanced analytic routines.  This is what databases do superbly well - if you're doing this in Excel, or indeed any non-SQL environment, you are working too hard.
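As a toy illustration of "let the database do it", here is a sketch using Python's built-in sqlite3 module; the table and its figures are invented for the example:

```python
import sqlite3

# A tiny in-memory example of letting the database filter, aggregate and
# sort before the data ever reaches a spreadsheet or analysis tool.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, units INTEGER, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?,?,?,?)", [
    ("East", "Widget", 120, 2400.0),
    ("East", "Gadget",  30,  900.0),
    ("West", "Widget",  80, 1600.0),
    ("West", "Gadget",  50, 1500.0),
])

# Aggregate and sort in SQL rather than by hand in Excel.
rows = conn.execute("""
    SELECT region, SUM(units) AS units, SUM(revenue) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
""").fetchall()
for row in rows:
    print(row)   # ('East', 150, 3300.0) then ('West', 130, 3100.0)
```

The same GROUP BY / ORDER BY pattern scales from this four-row toy to the multi-million-row tables the post is really talking about.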

While you are at it, get a good grasp of database design principles and normalization too.

2013 - the year for Column Oriented Databases ?

Column storage is not a new idea in relational databases, but perhaps it's an idea that is about to mature.  Regular, relational databases physically store rows of data together which is very efficient for transactional databases that update and retrieve a few rows at a time.

A column-oriented database stores entire columns of data together.  This is very efficient for systems where aggregates are computed over large numbers of similar rows - such as in data warehouse reporting and analytics.
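The trade-off can be illustrated with plain Python data structures; this is a toy model only, not how a real column-store engine is implemented:

```python
# Row storage: each record kept together -- cheap to fetch or update one row.
row_store = [
    {"item": "A", "store": 1, "units": 10},
    {"item": "B", "store": 1, "units": 4},
    {"item": "A", "store": 2, "units": 7},
]

# Column storage: each column kept together -- an aggregate touches only
# the columns it needs, not every field of every row.
col_store = {
    "item":  ["A", "B", "A"],
    "store": [1, 1, 2],
    "units": [10, 4, 7],
}

total_from_rows = sum(r["units"] for r in row_store)  # reads every record whole
total_from_cols = sum(col_store["units"])             # reads a single column
print(total_from_rows, total_from_cols)               # 21 21
```

Same answer either way, but the column layout only had to touch one-third of the data - which is why aggregation queries over a few chosen columns scale so well on column stores.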

Notably, SQL Server 2012 introduced its first direct support for column storage this year.  It's not perfect, particularly as the implementation limits how you can write to the database, but it is fast.  My own testing on a large database showed a 10-fold increase in speed on large aggregation queries and almost linear scaling as the number of fields chosen (and hence the number of columns that must be accessed) changed.

I've also had excellent performance from InfiniDB, available as a free "community edition", though I suspect I have not fully tested its capability with tables that only have a few hundred million records :-)

Column storage is exceptionally fast and could allow for significant changes in system architecture for reporting and analytic tools.  How about POS analytics?  Straight from database to reports without the need for any custom architecture, pre-aggregation, cubes or waiting on batch processing?  Want to update a few item attributes and re-run your reports now?  No problem!  (Embed in-database analytic capability so you can do more than add/subtract/multiply/divide and you have a serious number-crunching platform.)


The #1 item for your list - a (much) bigger dry erase board.

At times, it seems I can't think clearly without a dry-erase board and pen.  My own workhorse is an 8'x4' board that sees heavy daily use, but bigger ideas need more space.  Amazon has a dry-erase paint on offer that I may have to try - enough for about 30 linear feet of wall for $500.

Jumbo Whiteboard Dry Erase Paint Clear 240 Sq Ft




What do you think?

So what do you think should be on every good Business Analyst's Christmas list?   Am I missing this year's top gift?  What should a "naughty" analyst receive - more bad data?  Let me know in the comments section.





P.S.  Santa - if you're reading this, I rather like this T-shirt from CafePress.com too.








Friday 14 December 2012

Leadership


By Jackie, Researcher
Topic: Education
Area of discussion: Management
Chapter: Leadership: Transformational Leadership


The objective of this posting is to share a real-life example of transformational leadership. This example is based on Siemens (the German multinational heavy engineering and electronics conglomerate) and its previous leader, Von Pierer (CEO of the company from 1992 to 2005). This posting emphasizes practical application rather than theoretical concepts. Indeed, leadership is extremely important in determining a company's ability to survive nowadays. It is a process by which a person exerts influence over other people and inspires, motivates, and directs their activities to help achieve group or organizational goals.




      Siemens fell on hard times in 1992 because of rising worldwide competition, an inflexible hierarchy and a conservative culture which greatly reduced decision-making speed and stifled creativity and innovation (Jones & George 2003, p.459). Fortunately, Siemens' chairman, Von Pierer, took a 'shift-in-style' approach by utilising transformational leadership. He removed two layers of middle management, downsized the workforce by 7.5% through early retirement and sold slow-growing businesses worth $2 billion (Miller 1995, p.53). Decision-making processes were also sped up through the creation of new management boards (Boddy 2005, p.385). At the new Siemens, subordinates were given chances to critique their managers, who in return received training to be more democratic and participative, while employees were given ample freedom to speak their minds.
 
      Jones and George (2003, p.460) highlight that Von Pierer successfully transformed his subordinates in three essential ways. Firstly, he 'brainwashed' away his employees' passive mindset and increased their awareness of the importance of their jobs, and of high performance, in attaining Siemens' goals. For instance, upon realising that microprocessor sales managers were not performing at their best because they thought their jobs were unimportant, he called in Siemens' top customers, such as Opel, Ford and Sony, to critique the service and express their dissatisfaction at lousy service and unreliable delivery schedules. Secondly, he made his subordinates aware of their own needs for growth, development and accomplishment. For example, Von Pierer organised numerous workshops and training sessions, developed fast-track career programmes like TOP (Time-Optimized Processes) and launched a high-profile educational campaign for all levels of employees. Thirdly, he motivated his workers to work for the good of the company, not just for their personal gain or benefit. This can be seen when he tried to get all employees thinking along similar lines by inserting self-addressed postcards in the company magazine, urging them to send their ideas for improvement directly to him.

      He also engaged in developmental consideration, such as providing counselling sessions with a psychologist for managers who faced difficulties adapting to Siemens' changes and sponsoring hiking trips to stimulate employees to think and work in new ways (Miller 1995, p.52). One of the greatest outcomes was a team of Siemens engineers, working in jeans in a rented house, who developed a machine-tool control system in just one-third of the time and cost of the previous system. Von Pierer's effectiveness is reflected in the GLOBE research on German leaders, where a 'tough on the issue, soft on the person' strategy seems to have been the ultimate recipe for success at Siemens (Brodbeck, Frese & Javidan 2002, pp.21-24).


Appendix



References

 Boddy, D 2005, Management: An Introduction, 3rd edn, Prentice Hall, Harlow, p.385.

 Brodbeck, FC, Frese, M & Javidan, M 2002, ‘Leadership made in Germany: Low on compassion, high on performance’, Academy of Management Executive, vol.16, no.1, pp.21-24, viewed 24 September 2012, http://bschool.nus.edu/Departments/ManagementNOrganization/publication/MichaelFreseJournal/brodbeck%20frese%20javidan%20ame02%20germany%20globe.pdf

Jones, GR & George, JM 2003, Contemporary Management, 3rd edn, McGraw-Hill, New York, pp.459-462.

Miller, KL 1995, ‘Siemens Shapes Up’, Business Week, 30 April, pp.52-53, viewed 12 October 2012, http://www.businessweek.com/stories/1995-04-30/siemens-shapes-up


Sunday 9 December 2012

Why Don't We Sell More? in Graphic Form

I practice "mind mapping" even though a lot of these turn out looking a little more like flow-charts than true mind maps.  This came out of a client discussion last week.  Do others find value in this type of chart?

Distributor Planning Made Easy. Check out our Distributors Annual Planning Workbook:
http://amzn.com/1481196448

Tuesday 4 December 2012

But I'm a Salesperson, NOT a Planner!

I’m Already Working 12 Hours a Day, Do You Want Me Selling or Planning?




Many distributors have discovered that it takes a very long time to establish new customer relationships.  Others find that, in spite of success with existing products, their organization's ability to launch the related products necessary for long-term business health is hampered.

The simple truth is our industry has slowly adopted a dangerous habit.  We practice a reactive sales model.  It's not an immediate threat, but long term, it's crippling.  Here's how it works.  An existing customer calls with a question or support issue on some past purchase.  The salesperson reacts to this issue immediately.  Along the way some excellent customer support is provided.  The customer compliments the seller and potentially rewards this behavior with another purchase.  Everything sounds good so far?  The unfortunate part of this equation takes a while to manifest itself.   Our ability to find new customers or expand the product technologies we sell is compromised.

Planning lies central to this issue.  Or more precisely, the lack of planning is slowly cutting and constricting the life’s blood of future success.  During the past decade, in spite of technology tools to expedite its impact, planning has fallen on hard times.  This can range from simple things like setting appointments to investing the time matching product introductions to customer need.

While conducting research for our book, The Target Driven Sales Process, we discovered many salespeople still use product-of-the-week selling.  With no pre-thought, no metrics and no analytics, these folks randomly hand out product literature in a time-consuming, scattergun approach.


Planning improves efficiency.  Yet to a certain segment of our sales force, this concept seems counterintuitive.  Taking time to reinforce planning is frustrating because most of those in management positions take it for granted.  There really is a new reality.  It truly is different from back “in our day.”  Smartphones, iPads, electronic literature and lots of other gadgets allow a rookie to disguise their planning deficiency to a point.  Planning is different from 15 years ago, but the skills are still essential to success.  
We believe the first step in planning must involve the process of matching customer needs to our product offerings; most call this targeting.  Face-to-face customer time is a precious commodity. Wasting a single second talking about a product for which the customer has no interest is a travesty.  Wasting the time of a product specialist or other team member should be a hanging offense. 


Distributor Planning Made Easy.  Check out our Distributors Annual Planning Workbook:
http://tinyurl.com/DistributorAnnualPlanning