You are given an array points representing integer coordinates of some points on a 2D plane, where points[i] = [xi, yi].
The distance between two points (x1, y1) and (x2, y2) is defined as their Manhattan distance, |x1 - x2| + |y1 - y2|.
Return the minimum possible value for the maximum distance between any two remaining points after removing exactly one point.
Example 1:
Input: points = [[3,10],[5,15],[10,2],[4,4]]
Output: 12
Explanation:
The maximum distance after removing each point is as follows:
After removing the 0th point, the maximum distance is |5 - 10| + |15 - 2| = 18.
After removing the 1st point, the maximum distance is |3 - 10| + |10 - 2| = 15.
After removing the 2nd point, the maximum distance is |5 - 4| + |15 - 4| = 12.
After removing the 3rd point, the maximum distance is |5 - 10| + |15 - 2| = 18.
12 is the minimum possible maximum distance between any two points after removing exactly one point.
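As a sanity check on this walkthrough, the sketch below (function name brute_force is just illustrative, not part of the problem) tries every removal and takes the worst remaining pairwise Manhattan distance. It is fine for tiny inputs like this example but far too slow for the full constraints.

def brute_force(points):
    # Try removing each point in turn and keep the best (smallest)
    # resulting maximum pairwise Manhattan distance.
    best = float("inf")
    for skip in range(len(points)):
        kept = [p for i, p in enumerate(points) if i != skip]
        worst = max(abs(a[0] - b[0]) + abs(a[1] - b[1])
                    for a in kept for b in kept)
        best = min(best, worst)
    return best

print(brute_force([[3, 10], [5, 15], [10, 2], [4, 4]]))  # 12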
Example 2:
Input: points = [[1,1],[1,1],[1,1]]
Output: 0
Explanation:
Removing any one of the points results in a maximum distance of 0 between the remaining points.
Constraints:
3 <= points.length <= 10^5
points[i].length == 2
1 <= points[i][0], points[i][1] <= 10^8
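One way to stay within these constraints (a sketch under my own naming, not necessarily the intended solution) uses the standard identity that the largest pairwise Manhattan distance equals the larger of the spreads of x + y and x - y over all points. Removing a point that is not extreme in either of those quantities cannot shrink the maximum, so it suffices to try removing each of the (at most eight) extreme candidates.

def min_max_distance_after_removal(points):
    n = len(points)
    # Rotate coordinates: the largest pairwise Manhattan distance equals
    # the larger of the spreads of s = x + y and d = x - y.
    s = [x + y for x, y in points]
    d = [x - y for x, y in points]

    def extreme_indices(vals):
        # Indices holding the two smallest and two largest values.
        order = sorted(range(n), key=lambda i: vals[i])
        return {order[0], order[1], order[-2], order[-1]}

    # Only points extreme in s or d can affect the maximum distance,
    # so only these (at most eight) removals need to be tried.
    candidates = extreme_indices(s) | extreme_indices(d)

    def max_distance_without(skip):
        rest = [i for i in range(n) if i != skip]
        spread_s = max(s[i] for i in rest) - min(s[i] for i in rest)
        spread_d = max(d[i] for i in rest) - min(d[i] for i in rest)
        return max(spread_s, spread_d)

    return min(max_distance_without(i) for i in candidates)

# Matches the examples above:
print(min_max_distance_after_removal([[3, 10], [5, 15], [10, 2], [4, 4]]))  # 12
print(min_max_distance_after_removal([[1, 1], [1, 1], [1, 1]]))             # 0

The dominant cost is sorting the rotated coordinates, so this runs in O(n log n) time and O(n) space, comfortably within points.length <= 10^5.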