Details
Description
TableSkewCostFunction uses the sum, over all tables, of the maximum per-server deviation in region count as its measure of unevenness. This does not work in a very common operational scenario. Say we have 100 regions on 50 nodes, two on each, and we add 50 new nodes that carry 0 regions each. The maximum deviation from the mean (1 region per server) is 1, compared to 99 in the worst case where all 100 regions sit on a single server. The normalized cost is 1/99 ≈ 0.01, below the default threshold of 0.05, so the balancer won't move anything. The proposal is to use the aggregated deviation of the region count per region server to detect this scenario: the sum of per-server deviations is 50*1 + 50*1 = 100, while the worst case is 99 + 99*1 = 198, giving a cost of 100/198 ≈ 0.5.
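Below is a minimal sketch, not the actual StochasticLoadBalancer code, comparing the two normalizations on the example above. The class and method names (TableSkewCostSketch, maxDeviationCost, aggregatedDeviationCost) are hypothetical and only illustrate the arithmetic.

{code:java}
// Hypothetical sketch of the two skew-cost normalizations for a single table.
public class TableSkewCostSketch {

  /** Current behaviour: normalize the max per-server deviation from the mean. */
  static double maxDeviationCost(int[] regionsPerServer, int totalRegions) {
    double mean = (double) totalRegions / regionsPerServer.length;
    double maxDev = 0;
    for (int count : regionsPerServer) {
      maxDev = Math.max(maxDev, Math.abs(count - mean));
    }
    // Worst case: all regions on one server, so the max deviation is (total - mean).
    double worstCase = totalRegions - mean;
    return worstCase == 0 ? 0 : maxDev / worstCase;
  }

  /** Proposed behaviour: normalize the summed per-server deviation from the mean. */
  static double aggregatedDeviationCost(int[] regionsPerServer, int totalRegions) {
    double mean = (double) totalRegions / regionsPerServer.length;
    double sumDev = 0;
    for (int count : regionsPerServer) {
      sumDev += Math.abs(count - mean);
    }
    // Worst case: one server holds everything (deviation total - mean) and each of
    // the remaining servers holds nothing (deviation mean each).
    double worstCase = (totalRegions - mean) + (regionsPerServer.length - 1) * mean;
    return worstCase == 0 ? 0 : sumDev / worstCase;
  }

  public static void main(String[] args) {
    // 100 regions: 50 old servers with 2 each, 50 new servers with 0 each.
    int[] counts = new int[100];
    for (int i = 0; i < 50; i++) {
      counts[i] = 2;
    }
    System.out.println(maxDeviationCost(counts, 100));        // ~0.01 (1/99): below threshold
    System.out.println(aggregatedDeviationCost(counts, 100)); // ~0.5 (100/198): triggers balancing
  }
}
{code}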
Attachments
Issue Links
- links to (6 linked issues)