Write a Go solution for the problem below.

Description:
You are given an array a of n points in k-dimensional space. Let the distance between two points a_x and a_y be sum_{i=1}^{k} |a_x,i - a_y,i| (also known as the Manhattan distance).

You have to process q queries of the following two types:

- 1 i b_1 b_2 ... b_k — set the i-th element of a to the point (b_1,b_2,...,b_k);
- 2 l r — find the maximum distance between two points a_i and a_j, where l<=i,j<=r.

Input Format:
The first line contains two numbers n and k (1<=n<=2*10^5, 1<=k<=5) — the number of elements in a and the number of dimensions of the space, respectively.

Then n lines follow, each containing k integers a_i,1, a_i,2, ..., a_i,k (-10^6<=a_i,j<=10^6) — the coordinates of the i-th point.

The next line contains one integer q (1<=q<=2*10^5) — the number of queries.

Then q lines follow, each denoting a query. There are two types of queries:

- 1 i b_1 b_2 ... b_k (1<=i<=n, -10^6<=b_j<=10^6) — set the i-th element of a to the point (b_1,b_2,...,b_k);
- 2 l r (1<=l<=r<=n) — find the maximum distance between two points a_i and a_j, where l<=i,j<=r.

There is at least one query of the second type.

Output Format:
Print the answer for each query of the second type.

Note:
None.

Output only the code with no comments, explanation, or additional text.