I'm pretty new to JavaScript and working through problems on LeetCode.
The description included is: "Given an array of unique integers salary where salary[i] is the salary of the employee i.
Return the average salary of employees excluding the minimum and maximum salary."
When I run my code, it reports incorrect output when the following array is passed in:
[25000,48000,57000,86000,33000,10000,42000,3000,54000,29000,79000,40000]
Expected Output: 41700.00000
My Output: 41000.00000
I've compared my code to other submissions and as far as I can tell mine should run the same. Here is my code:
function average(salary) {
  var sortedSalary = salary.sort();
  var total = sortedSalary.reduce((curr, acc) => {
    return curr + acc
  }, 0);
  var result = (total - sortedSalary[0] - sortedSalary[sortedSalary.length - 1]) / (sortedSalary.length - 2);
  return result;
};
console.log(average([25000,48000,57000,86000,33000,10000,42000,3000,54000,29000,79000,40000]));
Thank you for any insight into this.
With no comparator, Array.prototype.sort converts the elements to strings and sorts them lexicographically rather than numerically, so "3000" ends up after "10000" and your code subtracts 10000 instead of 3000 as the minimum.
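You can see the difference with a quick check in the console (a minimal demo of the default behaviour):

console.log([25000, 10000, 3000].sort());                // [10000, 25000, 3000] - string order
console.log([25000, 10000, 3000].sort((a, b) => a - b)); // [3000, 10000, 25000] - numeric order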
Use sort((a, b) => a - b) instead:
const salary = [25000, 48000, 57000, 86000, 33000, 10000, 42000, 3000, 54000, 29000, 79000, 40000];

function average(salary) {
  // Numeric comparator sorts by value; note that sort() also mutates the input array
  var sortedSalary = salary.sort((a, b) => a - b);
  // reduce's callback receives (accumulator, currentValue)
  var total = sortedSalary.reduce((acc, curr) => acc + curr, 0);
  var result = (total - sortedSalary[0] - sortedSalary[sortedSalary.length - 1]) / (sortedSalary.length - 2);
  return result;
}

console.log(average(salary)); // 41700
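As a side note, sorting isn't strictly necessary here: you can subtract the minimum and maximum directly with Math.min and Math.max, which also leaves the input array untouched. A minimal alternative sketch (the name averageNoSort is just illustrative):

// Alternative sketch: no sorting, and the input array is not mutated
function averageNoSort(salary) {
  const total = salary.reduce((acc, curr) => acc + curr, 0);
  return (total - Math.min(...salary) - Math.max(...salary)) / (salary.length - 2);
}

console.log(averageNoSort([25000, 48000, 57000, 86000, 33000, 10000, 42000, 3000, 54000, 29000, 79000, 40000])); // 41700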