We are proud to present our findings for the No Fuss Reviews 2012 console failure rate survey. Additional information can be found at the bottom of this page.
What does each chart mean?
Chart 1 is the overall failure rate of each console for this year's survey.
Chart 2 displays the results from each of our surveys from 2009, 2010, 2011 and 2012.
Chart 3 displays how many times a console needed to be replaced, as reported by survey participants, across all consoles.
Chart 4 is the year-on-year failure rate for the Xbox 360, showing how many times the console needed to be replaced as reported by survey participants.
Chart 5 is the year-on-year failure rate for the PS3, showing how many times the console needed to be replaced as reported by survey participants.
Chart 6 is the year-on-year failure rate for the Wii, showing how many times the console needed to be replaced as reported by survey participants.
Chart 7 shows a breakdown of the failure rate for each type of Xbox 360 console for each year we conducted the survey.
Chart 8 shows a breakdown of the failure rate for each type of PS3 console for each year we conducted the survey.
How was the survey delivered?
The results were compiled via a survey that adapted to the responses provided and asked follow-up questions where additional detail was required.
Was the survey only on No Fuss Reviews?
No, it was delivered throughout the No Fuss Reviews Network, which consists of 42 websites as of July 2012.
Over what period was the survey conducted?
The survey was conducted over 245 days. It started on the 20th of November 2011.
Under what criteria was the survey presented?
It was presented on a 1-in-3 random basis to any person who:
a) arrived at our sites from any search engine looking for any console specific game or article
b) had not already taken the survey
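The selection criteria above can be sketched as a simple eligibility check followed by a 1-in-3 random draw. This is a hypothetical illustration; the site's actual logic was not published, and the function and parameter names are ours.

```python
import random

def should_present_survey(arrived_from_search_engine, already_taken):
    """Hypothetical sketch of the 1-in-3 presentation rule described above.

    A visitor is eligible only if they arrived from a search engine
    (for a console-specific game or article) and have not already
    taken the survey; eligible visitors are then selected at random,
    roughly one in three.
    """
    if not arrived_from_search_engine or already_taken:
        return False
    return random.randrange(3) == 0  # about 1 in 3 eligible visitors
```
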
How many submissions were logged?
In total we logged 507,132 submissions; submissions after the 500,000 mark were discarded from the final results.
What was the demographic of the survey?
28% answered questions relating to the Wii
35% answered questions relating to the PS3
37% answered questions relating to the 360
How did you try to stop duplicate submissions?
We asked users to supply their email address, which was stored in our database for the sole purpose of checking for duplicate submissions.
We also applied a session limiter, which only allowed users to take the survey once during a browser session. Finally, we used randomised cookies and stored the IP addresses of users who had taken the survey, again to help prevent duplicate submissions.
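The layered checks described above (email, per-session flag, randomised cookie, stored IP address) could be combined along these lines. This is a minimal sketch under our own assumptions; all names are illustrative, and the survey's actual implementation was not published.

```python
import secrets

class DuplicateChecker:
    """Hypothetical sketch of the layered duplicate-submission checks."""

    def __init__(self):
        self.seen_emails = set()   # stands in for the database of supplied emails
        self.seen_cookies = set()  # randomised cookie values already issued
        self.seen_ips = set()      # IP addresses of prior participants

    def issue_cookie(self):
        # Randomised cookie value handed out when the survey is completed.
        return secrets.token_hex(16)

    def is_duplicate(self, email, cookie, ip, session_has_taken_survey):
        # Session limiter: at most one attempt per browser session.
        if session_has_taken_survey:
            return True
        # Any previously seen identifier flags the submission as a duplicate.
        return (email.lower() in self.seen_emails
                or cookie in self.seen_cookies
                or ip in self.seen_ips)

    def record(self, email, cookie, ip):
        # Store identifiers once a submission is accepted.
        self.seen_emails.add(email.lower())
        self.seen_cookies.add(cookie)
        self.seen_ips.add(ip)
```

No single check here is decisive on its own (emails can vary, IP addresses can be shared), which is presumably why several were layered together.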
How was the survey presented?
The survey was presented as opt-in after the user had been viewing the page for 5 seconds or longer.
The survey questions were presented one at a time depending on the previous answer.
Was any other information collected?
As well as IP addresses, browser versions were measured live to help prevent 'bots' and predefined scripts from completing the survey.