On the Isotropic Constant of Random Polytopes

Let $X_1,\ldots,X_N$ be independent random vectors uniformly distributed on an isotropic convex body $K\subset\mathbb{R}^n$, and let $K_N$ be the symmetric convex hull of the $X_i$'s. We show that with high probability $L_{K_N}\le C\sqrt{\log(2N/n)}$, where $C$ is an absolute constant. This result closes the gap in known estimates in the range $Cn\le N\le n^{1+\delta}$. Furthermore, we extend our estimates to the symmetric convex hulls of the vectors $y_1X_1,\ldots,y_NX_N$, where $y=(y_1,\ldots,y_N)$ is a vector in $\mathbb{R}^N$. Finally, we discuss the case of a random vector $y$.

Among all convex bodies in $\mathbb{R}^n$ the Euclidean ball is the one with the smallest isotropic constant, that is, $L_K\ge L_{B_2^n}\ge c$, where $c$ is an absolute positive constant. However, it is still an open problem to determine whether there exists an absolute constant $C$ such that for every convex body $K\subset\mathbb{R}^n$ one has $L_K\le C$. The boundedness of $L_K$ by an absolute constant is equivalent to the long-standing hyperplane conjecture [9]. The best general upper bound known up to now is $L_K\le Cn^{1/4}$ [22]. This estimate slightly improves (by a logarithmic factor) Bourgain's earlier upper bound [10].
Since the remarkable result of Gluskin [18], random polytopes have been known to provide many examples of convex bodies (and related normed spaces) with a "pathologically bad" behavior of various parameters of linear and geometric nature (we refer to the survey [27] and references therein; see also the recent examples in [19] and [21]). Not surprisingly, they were also a natural candidate for a potential counterexample to the hyperplane conjecture. This was resolved in [23], where it was shown that the convex hull or the symmetric convex hull of independent Gaussian random vectors in $\mathbb{R}^n$ with high probability has a bounded isotropic constant. Some other distributions of vertices were also considered; in all of them the vertices had independent coordinates.
Following the ideas of [23], the problem of estimating the isotropic constant of random polytopes was considered in [4], for independent random vectors distributed uniformly on the sphere $S^{n-1}$, and in [13], for independent random vectors uniformly distributed on an isotropic unconditional convex body. In these cases too, the isotropic constant of the random polytopes generated by these vectors is bounded with high probability. One can check that the same method works for independent random vectors uniformly distributed on a $\psi_2$ isotropic convex body as well.
In this paper we estimate the isotropic constant of a random polytope in an isotropic convex body (see Sect. 2 for the definitions). It is known (see [6,20] or [5]) that if $K_N$ is a polytope in $\mathbb{R}^n$ with $N$ vertices then
$$L_{K_N}\le C\sqrt{N/n},$$
where $C$ is an absolute constant. In [14,15], the authors provided a lower estimate for the volume of a random polytope $K_N$ obtained as the convex hull of $N\le e^{\sqrt{n}}$ random points, namely
$$|K_N|^{1/n}\ge c\,\sqrt{\frac{\log(2N/n)}{n}}$$
with high probability (see the end of Sect. 2 for a more precise formulation and details). On the other hand, the proof of the estimate $L_{K_N}\le C\sqrt{\log N}$ in [5] passes through showing that if $X_1,\ldots,X_N$ are the vertices of $K_N$, then for any affine transformation $T$ the second moment $\int_{K_N}|Tx|^2\,dx$ is controlled in terms of $\max_{i\le N}|TX_i|^2$. Consequently, taking $T$ to be the identity operator and using the concentration of measure result proved by Paouris [30], we obtain that if $K_N$ is the convex hull or the symmetric convex hull of $N$ independent random vectors (for $n+1\le N\le e^{\sqrt{n}}$) uniformly distributed on an isotropic convex body, then with high probability
$$L_{K_N}\le C\sqrt{\log N}. \tag{1.1}$$
Notice that if $N\ge n^{1+\delta}$, $\delta\in(0,1)$, this estimate does not exceed $(C/\delta)\sqrt{\log(N/n)}$. However, $C/\delta$ tends to infinity as $\delta$ tends to $0$. On the other hand, if $N$ is proportional to $n$ then the isotropic constant of $K_N$ is bounded (by an absolute constant), while the upper bound in (1.1) is not. The following theorem closes the gap between $N\le cn$ and $N\ge n^{1+\delta}$.

Theorem 1.1 There exist absolute positive constants $c$, $C$ such that if $n\le N$, $X_1,\ldots,X_N$ are independent random vectors uniformly distributed on an isotropic convex body $K$, and $K_N$ is their symmetric convex hull, then with high probability
$$L_{K_N}\le C\sqrt{\log(2N/n)}.$$

Furthermore, we study a natural, more general family of perturbations of random polytopes. Namely, for any $X_1,\ldots,X_N\in\mathbb{R}^n$ and $y\in\mathbb{R}^N$, we define $K_{N,y}=\mathrm{conv}\{\pm y_1X_1,\ldots,\pm y_NX_N\}$.
As it turns out, Theorem 1.1 is valid for this family as well. To describe this result we need more notation.
For a vector $y=(y_1,\ldots,y_N)\in\mathbb{R}^N$, let $(y_1^*,\ldots,y_N^*)$ denote the decreasing rearrangement of $(|y_1|,\ldots,|y_N|)$. For $1\le k\le N$ and $n\le m\le N$ denote
$$\|y\|_{k,2}=\Big(\sum_{i=1}^{k}(y_i^*)^2\Big)^{1/2} \quad\text{and}\quad \alpha_{y,m}=\Big(\prod_{j=m-n+1}^{m}y_j^*\Big)^{1/n}.$$
Our main result is the following theorem.

Remark 1 Note that if $n\le N\le Cn$ then $L_K\le C$ for any symmetric polytope $K$ generated by $N$ vectors [6].
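As a quick numerical illustration (ours, not part of the paper), the sketch below computes the decreasing rearrangement, the norm $\|y\|_{k,2}$, and the quantity $\alpha_{y,m}$ for a small vector. The exact form of $\alpha_{y,m}$ is an assumption here: we take it to be the geometric mean of the $n$ smallest among the $m$ largest moduli, which is consistent with the inequality $\alpha_{y,m}\ge y_m^*$ used later.

```python
import numpy as np

# Illustration of the notation above.  NOTE: the precise definition of
# alpha_{y,m} is assumed (geometric mean of y^*_{m-n+1}, ..., y^*_m),
# chosen so that alpha_{y,m} >= y_m^* as used in Sect. 4.
def decreasing_rearrangement(y):
    """Return (y_1^*, ..., y_N^*): the moduli |y_i| sorted decreasingly."""
    return np.sort(np.abs(y))[::-1]

def norm_k2(y, k):
    """||y||_{k,2}: Euclidean norm of the k largest moduli."""
    return np.linalg.norm(decreasing_rearrangement(y)[:k])

def alpha(y, m, n):
    """Assumed alpha_{y,m}: geometric mean of y^*_{m-n+1}, ..., y^*_m."""
    ys = decreasing_rearrangement(y)
    return float(np.exp(np.mean(np.log(ys[m - n:m]))))

y = np.array([0.5, -3.0, 2.0, -0.25, 1.0, 4.0])
print(decreasing_rearrangement(y))   # [4.  3.  2.  1.  0.5 0.25]
print(norm_k2(y, 2))                 # sqrt(16 + 9) = 5.0
print(alpha(y, m=4, n=2))            # sqrt(2 * 1) ≈ 1.4142
```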
Finally, we apply Theorem 1.2 to the case when the vector describing the perturbation is itself random. Such a setting has recently been considered in [7]. In Theorem 4.1 we show that for the Gaussian vector $G=(g_1,\ldots,g_N)$ in $\mathbb{R}^N$, with high probability, $L_{K_{N,G}}\le C\sqrt{\log(2N/n)}$.

Preliminaries
In this paper the letters $c,C,c_1,C_1,\ldots$ will always denote absolute positive constants, whose values may change from line to line. Given two functions $f$ and $g$ we say that they are equivalent, and write $f\approx g$, if $c_1f\le g\le c_2f$. By $|\cdot|$ and $\langle\cdot,\cdot\rangle$ we denote the canonical Euclidean norm and the canonical inner product on $\mathbb{R}^n$. The (unit) Euclidean ball and sphere are denoted by $B_2^n$ and $S^{n-1}$. Let $K$ be a symmetric convex body in $\mathbb{R}^n$ and let $\|\cdot\|_K$ be its associated norm, $\|x\|_K=\inf\{t>0\,:\,x\in tK\}$. The support function of $K$ is $h_K(y)=\max_{x\in K}\langle x,y\rangle$; it is the norm associated with the polar body of $K$. Given a convex body $K$ we denote its volume by $|K|$. We also denote by $|E|$ the cardinality of a finite set $E$. For $E\subset\{1,\ldots,N\}$ the coordinate projection onto $\mathbb{R}^E$ is denoted by $P_E$.
We say that a convex body $K\subseteq\mathbb{R}^n$ is isotropic if it has volume $|K|=1$, its center of mass is at the origin (i.e., $\int_Kx\,dx=0$), and for every $\theta\in S^{n-1}$ one has
$$\int_K\langle x,\theta\rangle^2\,dx=L_K^2,$$
where $L_K$ is a constant independent of $\theta$, called the isotropic constant of $K$. It is known that every convex body has a unique (up to an orthogonal transformation) affine image that is isotropic. This allows one to define the isotropic constant of any convex body as the isotropic constant of its isotropic image. It is also known (see, e.g., [28]) that $L_K\ge L_{B_2^n}\ge c$ for an absolute positive constant $c$.

We need two more results on the distribution of Euclidean norms of random vectors and their sums. Let $X_i$, $i\le N$, be independent random vectors uniformly distributed in an isotropic convex body $K\subset\mathbb{R}^n$. Let $A$ be the random $n\times N$ matrix whose columns are the $X_i$'s. For $m\le N$ denote
$$A_m=\sup\big\{|Ay|\,:\,y\in S^{N-1},\ |\mathrm{supp}(y)|\le m\big\}$$
(supp denotes the support of $y$). Theorem 3.13 in [1] (note the different normalization) implies the following estimate.
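As a numerical sanity check of the isotropy definition (ours, not part of the paper), the sketch below verifies by Monte Carlo that the cube $[-1/2,1/2]^n$ satisfies the definition with $L_K=1/\sqrt{12}$, independently of the direction $\theta$.

```python
import numpy as np

# Monte Carlo check of the isotropy definition on the cube [-1/2, 1/2]^n:
# |K| = 1, barycenter 0, and int_K <x, theta>^2 dx = L_K^2 = 1/12 for
# every direction theta, i.e. L_K = 1/sqrt(12) ≈ 0.2887.
rng = np.random.default_rng(0)
n, samples = 5, 2_000_000
X = rng.uniform(-0.5, 0.5, size=(samples, n))  # uniform points in the cube

theta = rng.normal(size=n)
theta /= np.linalg.norm(theta)                 # random direction on S^{n-1}

# Since |K| = 1, the sample mean estimates the integral over K.
second_moment = np.mean((X @ theta) ** 2)
L_K = np.sqrt(second_moment)
print(L_K, 1 / np.sqrt(12))                    # both ≈ 0.2887
```

Repeating with a different `theta` gives the same value, which is exactly the content of the definition.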

Theorem 2.1
There is an absolute positive constant $C$ such that for every $\gamma\ge1$ and every $m\le N$ one has
$$\mathbb{P}\Big(A_m\ge C\gamma\Big(\sqrt{n}+\sqrt{m}\,\log\frac{2N}{m}\Big)L_K\Big)\le\exp\Big(-\gamma\sqrt{n}\,\log\frac{2N}{m}\Big).$$

The following theorem is a combination of Paouris's theorem ([30]; see also [3] for a short proof) with the union bound (cf. Lemma 3.1 in [1]).

Theorem 2.2
There exists an absolute positive constant $C$ such that for any $N\le\exp(\sqrt{n})$ and for every $\lambda\ge1$ one has
$$\mathbb{P}\Big(\max_{i\le N}|X_i|\ge C\lambda\sqrt{n}\,L_K\Big)\le\exp\big(-\lambda\sqrt{n}\big).$$

Finally, we need an estimate on the volume of the random polytope $K_N$, where $X_i$, $i\le N$, are independent random vectors uniformly distributed in an isotropic convex body $K\subset\mathbb{R}^n$. The estimates of the following theorem were observed in [14] (see Fact 3.2, the remarks following it, and Fact 3.3 there; see also [32] and Chapter 11 of [11], where the assumption $N\ge Cn$ was relaxed to $N\ge n$).

Theorem 2.3
There are absolute positive constants $c_1,c_2$ such that for $n\le N\le e^{\sqrt{n}}$, with probability at least $1-e^{-c_2\sqrt{n}}$,
$$|K_N|^{1/n}\ge c_1\sqrt{\frac{\log(2N/n)}{n}}.$$

In fact this theorem is a combination of three results. The first says that $K_N$ contains the centroid body. Recall that for $p\ge1$ the $p$-centroid body $Z_p(K)$ was introduced in [26] (with a different normalization) as the convex body whose support function is
$$h_{Z_p(K)}(\theta)=\Big(\int_K|\langle x,\theta\rangle|^p\,dx\Big)^{1/p}.\tag{2.2}$$
In [15] (Theorem 1.1) the authors proved that for every choice of parameters $\beta\in(0,1/2)$ and $\gamma>1$ one has the inclusion $K_N\supset c_1Z_p(K)$ for $p=c_2\beta\log(2N/n)$ and $N\ge c_3\gamma n$, with probability controlled in terms of the norm of the random matrix $A$ whose columns are $X_1,\ldots,X_N$. The probability that the norm of $A$ (note that $\|A\|=A_N$) is large was estimated in [1] (combine Theorems 2.1 and 2.2 above). Finally, from the results of [24] and [30] the bound
$$|Z_p(K)|^{1/n}\ge c\sqrt{p/n}$$
follows provided that $p\le\sqrt{n}$ (it improves the bound provided in [25]). In Sect. 4 we will use the following standard estimate; for the sake of completeness we provide a proof (cf. Example 10 in [16]).

Lemma 2.4
There exists an absolute positive constant $C$ such that for $Cm\le N$ and independent standard Gaussian random variables $g_1,\ldots,g_N$ one has, with probability exponentially close to one,
$$g_m^*\ge c\sqrt{\log(N/m)},$$
where $g_m^*$ denotes the $m$-th largest of $|g_1|,\ldots,|g_N|$.
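The flavor of this lemma is easy to check numerically (our sketch, not part of the paper): for $N$ i.i.d. standard Gaussians, the $m$-th largest modulus concentrates at a level of order $\sqrt{\log(N/m)}$; the factor $\sqrt{2\log(N/m)}$ used below is the classical order-statistics asymptotic, accurate only up to lower-order corrections.

```python
import numpy as np

# Monte Carlo look at Gaussian order statistics: the m-th largest of
# |g_1|, ..., |g_N| is of order sqrt(log(N/m)).  The crude predictor
# sqrt(2 log(N/m)) overshoots by ~10-15% at this scale, which is fine
# since the lemma only claims the order of magnitude.
rng = np.random.default_rng(1)
N, m, trials = 100_000, 100, 20

levels = []
for _ in range(trials):
    g = np.abs(rng.normal(size=N))
    g_sorted = np.sort(g)[::-1]        # decreasing rearrangement g_1^* >= ...
    levels.append(g_sorted[m - 1])     # g_m^*, the m-th largest modulus

empirical = np.mean(levels)
predicted = np.sqrt(2 * np.log(N / m))
print(empirical, predicted)            # same order of magnitude
```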

Proofs
In this section we prove Theorem 1.2. The proof consists of two propositions.
To prove this proposition we need the following lemma.

Lemma 3.2 Let $1\le n\le N$ be integers and let $P=\mathrm{conv}\{P_1,\ldots,P_N\}$ be a non-degenerate symmetric polytope in $\mathbb{R}^n$. Then the second moment $\int_P|x|^2\,dx$ admits an upper bound in terms of $\sup_E|\det(P_i)_{i\in E}|$, where the supremum is taken over all subsets $E\subset\{1,\ldots,N\}$ of cardinality $n$.
Proof We can decompose $P$ into a disjoint union of simplices (up to sets of measure $0$), say $P=\bigcup_{i=1}^{\ell}C_i$, where each $C_i$ is of the form $\mathrm{conv}\{0,P_{i_1},\ldots,P_{i_n}\}$ for some choice of the $P_{i_j}$'s. For every such $C_i$, denote $F_i:=\mathrm{conv}\{P_{i_1},\ldots,P_{i_n}\}$. Then any integrable function $f$ can be integrated over $P$ by summing, over $i$, its integrals over the cones $C_i$ with apex $0$ and base $F_i$; here $\nu(y)$ denotes the outer normal vector to $P$ at the point $y$, and $d(0,F_i)$ is the distance from the origin to the affine subspace spanned by $F_i$. Thus, as in [23], one bounds $\int_{C_i}|x|^2\,dx$ for every $i\le\ell$, and in particular $\int_P|x|^2\,dx$, in terms of a supremum over all $F=\mathrm{conv}\{P_{i_1},\ldots,P_{i_n}\}$. Note that any such $F$ can be written as $F=T\Delta_{n-1}$, where $\Delta_{n-1}=\mathrm{conv}\{e_1,\ldots,e_n\}$ denotes the regular $(n-1)$-dimensional simplex in $\mathbb{R}^n$ and $T$ is the matrix whose columns are the vectors $P_{i_j}$. Since
$$\int_{\mathrm{conv}\{0,e_1,\ldots,e_n\}}x_ix_j\,dx=\frac{1+\delta_{ij}}{(n+2)!},$$
where $\delta_{ij}$ is the Kronecker delta, for every $F=\mathrm{conv}\{P_{i_1},\ldots,P_{i_n}\}$ we obtain
$$\int_{\mathrm{conv}\{0,P_{i_1},\ldots,P_{i_n}\}}|x|^2\,dx=\frac{|\det T|}{(n+2)!}\Big(\sum_{j=1}^{n}|P_{i_j}|^2+\Big|\sum_{j=1}^{n}P_{i_j}\Big|^2\Big).$$
This implies the desired estimate. □
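The simplex moment formula invoked in the proof above is standard and easy to verify numerically; the Monte Carlo sketch below (ours, not part of the paper) checks $\int_{\mathrm{conv}\{0,e_1,\ldots,e_n\}}x_ix_j\,dx=(1+\delta_{ij})/(n+2)!$ for $n=3$.

```python
import numpy as np
from math import factorial

# Monte Carlo check of the simplex moment formula used in Lemma 3.2:
#   int_{conv{0, e_1, ..., e_n}} x_i x_j dx = (1 + delta_ij) / (n + 2)!
# Uniform points on the simplex {x >= 0, sum(x) <= 1} are obtained from a
# Dirichlet(1, ..., 1) sample by dropping the last coordinate.
rng = np.random.default_rng(2)
n, samples = 3, 2_000_000
X = rng.dirichlet(np.ones(n + 1), size=samples)[:, :n]

vol = 1 / factorial(n)                       # |conv{0, e_1, ..., e_n}| = 1/n!
mixed = vol * np.mean(X[:, 0] * X[:, 1])     # i != j  ->  1/(n+2)! = 1/120
diag = vol * np.mean(X[:, 0] ** 2)           # i == j  ->  2/(n+2)! = 2/120
print(mixed, 1 / factorial(n + 2))
print(diag, 2 / factorial(n + 2))
```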
Proof of Proposition 3.1 Note that if $y_n^*>0$ then the cardinality of the support of $y$ is at least $n$, so $K_{N,y}$ is non-degenerate with probability one. Therefore, with probability one, $K_{N,y}$ is non-degenerate simultaneously for all $y$ in any countable dense subset of $B_0:=\{y\in\mathbb{R}^N\,:\,\|y\|_{n,2}\le1,\ y_n^*>0\}$. Clearly, the supremum in question is the same over $y\in B_0$ and over such a dense set. (Formally, we should additionally take the supremum over signs $\varepsilon_i=\pm1$ and consider $y_i\varepsilon_iX_i$ in the formula under the suprema; but, since $B_0$ is unconditional, the supremum over the $\varepsilon_i$'s can be omitted.)

Now, by Lemma 3.2, the supremum in question is bounded in terms of the determinants of the $n\times n$ submatrices of $A$, where $A$ is the matrix whose columns are $X_1,\ldots,X_N$. Therefore, applying Theorems 2.1 and 2.2 (with $m=n$ and $\lambda=2\log(2N/n)$) we obtain the desired estimate with probability greater than $1-\exp\big(-\sqrt{n}\,\log(2N/n)\big)$. □

Proposition 3.3
There exist absolute positive constants $c_1,c_2$ such that if $n\le N\le e^{\sqrt{n}}$ and $X_1,\ldots,X_N$ are independent random vectors distributed uniformly on an isotropic convex body $K$, then for every $y\in\mathbb{R}^N$ the volume $|K_{N,y}|$ admits, with high probability, a lower bound in terms of $\alpha_{y,m}$; moreover, the corresponding event can be chosen uniformly over $y$.

The probability estimates in Proposition 3.3 are based on an estimate of the corresponding probability for a fixed $y$ together with the union bound. We start with the following lemma.

Lemma 3.4
There exist absolute positive constants $c_1,c_2$ such that the following holds. Let $n\le m\le N\le e^{\sqrt{n}}$ and let $X_1,\ldots,X_N$ be independent random vectors distributed uniformly on an isotropic convex body $K$. Then for every $y\in\mathbb{R}^N$ with $y_m^*>0$ there exists $v=v(y)\in\mathbb{R}^N$ having $0/1$ coordinates, with exactly $m$ ones, such that
$$|K_{N,y}|\ge\alpha_{y,m}^n\,|K_{N,v}|.$$

Proof Fix $y\in\mathbb{R}^N$ with $y_m^*>0$ (i.e., $|\mathrm{supp}\,y|\ge m$). Let $i_1,\ldots,i_m$ be indices such that $|y_{i_j}|=y_j^*$, and let $v=v(y)\in\mathbb{R}^N$ be the vector with $v_k=1$ if $k=i_j$ for some $j$, and $v_k=0$ otherwise. Decompose the polytope $K_{N,v}$ into a disjoint union of simplices (up to a set of measure zero), $K_{N,v}=\bigcup_kC_k$, where $C_k=\mathrm{conv}\{0,\varepsilon_{k_1}X_{k_1},\ldots,\varepsilon_{k_n}X_{k_n}\}$ for some signs $\varepsilon_{k_j}=\pm1$ and some vectors $X_{k_j}$, given by a simplicial decomposition of the facets of $K_{N,v}$. Denote
$$C_{k,y}=\mathrm{conv}\big\{0,\varepsilon_{k_1}|y_{k_1}|X_{k_1},\ldots,\varepsilon_{k_n}|y_{k_n}|X_{k_n}\big\}.$$
Clearly, the $C_{k,y}$'s are also disjoint up to a set of measure zero, and
$$|C_k|=\big|\det\big(\varepsilon_{k_1}X_{k_1},\ldots,\varepsilon_{k_n}X_{k_n}\big)\big|\,\big|\mathrm{conv}\{0,e_1,\ldots,e_n\}\big|\le\frac{1}{\alpha_{y,m}^n}\,\big|\det\big(\varepsilon_{k_1}|y_{k_1}|X_{k_1},\ldots,\varepsilon_{k_n}|y_{k_n}|X_{k_n}\big)\big|\,\big|\mathrm{conv}\{0,e_1,\ldots,e_n\}\big|.$$

This implies
$$|K_{N,v}|=\sum_k|C_k|\le\frac{1}{\alpha_{y,m}^n}\sum_k|C_{k,y}|\le\frac{1}{\alpha_{y,m}^n}\,|K_{N,y}|.$$
This proves the first estimate. The second one follows from Theorem 2.3, since $K_{N,v}$ is a symmetric random polytope in an isotropic convex body generated by $m\ge n$ random points. To prove the second bound of Proposition 3.3, note that the set $\{v(y)\}_{y\in\mathbb{R}^N}$ ($v(y)$ is from Lemma 3.4) has cardinality at most $\binom{N}{m}$, and that the probability of failure for a fixed $v$ can be controlled provided that $k\le c\sqrt{N}/\log N$. Lemma 3.4 and the union bound then give the estimate for all $y$ simultaneously. The result follows by the union bound and (3.1).
For $N\ge e^{\sqrt{n}}$ the theorem follows from the general estimate $L_K\le Cn^{1/4}$, valid for any $n$-dimensional convex body [22].

Random Perturbations of Random Polytopes
In this section $G=(g_1,\ldots,g_N)$ denotes a standard Gaussian random vector in $\mathbb{R}^N$, independent of all other random variables. In the following theorem, which is a consequence of Theorem 1.2 and Lemma 2.4, we estimate $L_{K_{N,G}}$.

Theorem 4.1 Let $n\le N$. Let $X_1,\ldots,X_N$ be independent copies of a random vector uniformly distributed on an isotropic convex body. Then, with probability at least $1-e^{-c_1\sqrt{n}}$,
$$L_{K_{N,G}}\le c_2\sqrt{\log(2N/n)},$$
where $c_1$ and $c_2$ are absolute positive constants.
Proof Without loss of generality we may assume that $N\ge Cn$ for a sufficiently large absolute constant $C$ (see Remark 1 following Theorem 1.2). It is well known (and can be computed directly) that for the Gaussian vector $G=(g_1,\ldots,g_N)$ one has
$$\mathbb{E}\|G\|_{n,2}\approx\sqrt{n\log\frac{2N}{n}}.$$
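This equivalence (up to absolute constants) can be checked numerically; the sketch below (ours, not part of the paper) estimates $\mathbb{E}\|G\|_{n,2}$ by Monte Carlo and compares it with $\sqrt{n\log(2N/n)}$.

```python
import numpy as np

# Monte Carlo check that E||G||_{n,2} -- the Euclidean norm of the n largest
# (in modulus) coordinates of a standard Gaussian vector in R^N -- is of
# order sqrt(n log(2N/n)).  The "≈" hides absolute constants, so we only
# verify that the ratio is of order 1.
rng = np.random.default_rng(3)
N, n, trials = 10_000, 100, 50

norms = []
for _ in range(trials):
    g = np.abs(rng.normal(size=N))
    top_n = np.sort(g)[-n:]                  # the n largest |g_i|
    norms.append(np.linalg.norm(top_n))      # ||G||_{n,2}

empirical = np.mean(norms)
predicted = np.sqrt(n * np.log(2 * N / n))
print(empirical / predicted)                 # a constant of order 1
```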
Since $\alpha_{G,m}\ge g_m^*$, Theorem 1.2 applied with $m=\sqrt{nN}$ in the infimum implies the desired estimate. □

Concluding Remarks
In this section we show that, under the additional (strong) assumption that the random vectors are $\psi_2$ vectors, the polytope $K_{N,G}$ contains $Z_p$ for an appropriate $p$ (recall that $G$ is a standard Gaussian random vector in $\mathbb{R}^N$ and that the body $Z_p$ was defined in (2.2)). We first recall the definition of the $\psi_\alpha$-norm. For a real random variable $z$ and $\alpha\in[1,2]$ we define the $\psi_\alpha$-norm by
$$\|z\|_{\psi_\alpha}=\inf\Big\{t>0\,:\,\mathbb{E}\exp\big((|z|/t)^\alpha\big)\le2\Big\}.$$
It is well known (see, e.g., [12]) that the condition $\|z\|_{\psi_\alpha}\le c_1$ is equivalent to the condition
$$\forall p>1:\quad\big(\mathbb{E}|z|^p\big)^{1/p}\le c_2\,p^{1/\alpha}\,\mathbb{E}|z|.$$
Let $X$ be a centered random vector in $\mathbb{R}^n$ and $\alpha>0$. We say that $X$ is $\psi_\alpha$, or a $\psi_\alpha$ vector, if $\|X\|_{\psi_\alpha}:=\sup_{y\in S^{n-1}}\|\langle X,y\rangle\|_{\psi_\alpha}<\infty$.
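The moment characterization of the $\psi_2$ condition is easy to see in action; the sketch below (ours, not part of the paper) checks numerically that for a standard Gaussian $g$ the normalized moments $(\mathbb{E}|g|^p)^{1/p}/(p^{1/2}\,\mathbb{E}|g|)$ stay bounded, as the equivalence above predicts for a $\psi_2$ variable.

```python
import numpy as np

# Moment characterization of psi_2: z is psi_2 iff
#   (E|z|^p)^{1/p} <= c p^{1/2} E|z|   for all p > 1.
# For a standard Gaussian the normalized ratio below stays bounded
# (it is in fact slowly decreasing, roughly 0.78-0.89 in this range).
rng = np.random.default_rng(4)
g = rng.normal(size=4_000_000)
mean_abs = np.mean(np.abs(g))                   # E|g| = sqrt(2/pi) ≈ 0.798

ratios = []
for p in [2, 4, 8, 16]:
    moment = np.mean(np.abs(g) ** p) ** (1 / p)
    ratios.append(moment / (np.sqrt(p) * mean_abs))
print(ratios)                                   # all of order 1
```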
We also denote $\gamma_p=(\mathbb{E}|g|^p)^{1/p}$, where $g$ is a standard Gaussian random variable.

Proposition 5.1 There are absolute positive constants $C$, $c_1$ and $c_2$ such that the following holds. Let $\beta>2$ and $N\ge C(\log\beta)^2\,n\log(3n)/\log\log(3n)$. Let $X_1,\ldots,X_N$ be independent copies of a random vector uniformly distributed on an isotropic convex body $K\subset\mathbb{R}^n$. Assume that $\|X_1\|_{\psi_2}\le\beta L_K\sqrt{\log(N/n)}$. Then, with probability at least $1-\alpha$, one has $K_{N,G}\supset c_1Z_p(K)$ for an appropriate $p$ of order $\log(2N/n)$, where $\alpha=e^{-\sqrt{N}}$ for $N\ge(n/\log(3n))^2$ and $\alpha=(N/n)^{-c_2N/n}$ otherwise.

Remark 1
We would like to note that for $N\ge n^2$ this proposition (with a slightly worse probability) was proved in [7] (see Lemma 4 there) without any assumption on the $\psi_2$-norm. Our proof is very similar to the one given in [7]; we provide it for the sake of completeness. The main new ingredient is the following lemma, which is needed to estimate the norm of the matrix $A$ in the proof of Proposition 5.1.

Lemma 5.2
Let $g$ be a standard Gaussian random variable and let $X$ be uniformly distributed on an isotropic convex body $K\subset\mathbb{R}^n$. Assume that $X$ is a $\psi_2$ random vector and that $\|X\|_{\psi_2}=\psi L_K$. Then for every $p\ge1$,
$$\big(\mathbb{E}|g|^p|X|^p\big)^{1/p}\le C\sqrt{p}\,\big(\sqrt{n}+\psi\sqrt{p}\big)L_K,$$
where $C$ is an absolute positive constant.
Proof We use the following form of Paouris's theorem, which may be deduced directly from Paouris's work [30] (as formulated here it first appeared as Theorem 2 in [2]; a short proof was given in [3]): for every $p\ge1$,
$$\big(\mathbb{E}|X|^p\big)^{1/p}\le C\Big(\mathbb{E}|X|+\sup_{\theta\in S^{n-1}}\big(\mathbb{E}|\langle X,\theta\rangle|^p\big)^{1/p}\Big).$$
Thus, using the assumptions on $X$, together with $\mathbb{E}|X|\le\sqrt{n}\,L_K$ and $(\mathbb{E}|g|^p)^{1/p}\le C\sqrt{p}$, we obtain the result. □
Proof of Proposition 5.1 First note that the vectors $g_iX_i$, $i\le N$, have $\psi_1$-norm bounded by $c_1\beta L_K\sqrt{\log(N/n)}$; indeed, this follows by estimating the moments of $g_i\langle X_i,\theta\rangle$ for any $p\ge1$ and $\theta\in S^{n-1}$. Denote by $A$ the $n\times N$ random matrix whose columns are the vectors $g_iX_i$; its norm is controlled by Theorem 3.13 in [1] (cf. Theorem 2.1). Now we estimate $\max_{i\le N}|g_iX_i|$. If $N\ge(n/\ln(3n))^2$ we choose $p=4\sqrt{N}$, otherwise $p=4(N/n)\ln(N/n)$. In both cases, by Lemma 5.2 and Chebyshev's inequality we obtain, for every $i\le N$, a tail estimate for $|g_iX_i|$; the union bound then applies due to the conditions on $N$ and $n$. This gives a bound on $\max_{i\le N}|g_iX_i|$ holding with probability at least $1-\alpha_0$, where $\alpha_0=e^{-\sqrt{N}}$ for $N\ge(n/\ln(3n))^2$ and $\alpha_0=\exp(-(N/n)\ln(N/n))$ otherwise. On the other hand, for every $\sigma\subseteq\{1,\ldots,N\}$, $q\ge1$ and $\theta\in S^{n-1}$, the Paley-Zygmund inequality yields a lower bound for the corresponding moments. Since $\gamma_p\approx\sqrt{p}$, and by Borell's lemma ([8]; see also Appendix III in [29]), the quantity above is bounded accordingly. Let $\sigma_1,\ldots,\sigma_n$ be a partition of $\{1,\ldots,N\}$ with $m\le|\sigma_i|$ for every $i$, and let $\|\cdot\|_0$ be the norm associated with this partition. Note that $\|\cdot\|_0\le n^{-1/2}|\cdot|$, and that the corresponding estimates hold for all $1\le i\le n$ and every $z\in\mathbb{R}^n$ with $A^t$ in place of $A$, where $A^t$ is the transpose of $A$.
By isotropicity we have $\big(\mathbb{E}|\langle X_1,z\rangle|^q\big)^{1/q}\ge L_K|z|$ (recall that we have chosen $q=(1/2)\log(N/n)>2$). Thus, assuming $\|A^t\|\le c_5\beta L_K\sqrt{N\log(N/n)}$, we obtain the corresponding bound on $\|A^tz\|_0$. Therefore, if $u\in U$ approximates $z\in S$, that is, if $\frac{1}{2}\gamma_q\big(\mathbb{E}|\langle X_1,z-u\rangle|^q\big)^{1/q}\le\delta$, then $u$ also satisfies
$$\|A^tu\|_0\le\|A^tz\|_0+c_7\beta\sqrt{\frac{N}{n}}\,\delta.$$