Replacing Sample Trimming with Boundary Correction in Nonparametric Estimation of First-Price Auctions


Replacing Sample Trimming with Boundary Correction in Nonparametric Estimation of First-Price Auctions

Brent R. Hickman and Timothy P. Hubbard

Abstract. Two-step nonparametric estimators have become standard in empirical auctions. A drawback concerns boundary effects, which cause inconsistencies near the endpoints of the support and bias in finite samples. To cope, sample trimming is typically used, which leads to non-random data loss. Monte Carlo experiments show this leads to poor performance near the support boundaries and on the interior due to bandwidth-selection issues. We propose a modification which employs boundary-correction techniques, demonstrating substantial improvement in finite-sample performance. We implement the new estimator using oil lease auctions data and find that trimming masks a substantial degree of bidder asymmetry and inefficiency in allocations.

Keywords: Auctions, nonparametric estimation, kernel smoothing, boundary correction. JEL subject classification: C14, D44.

We thank participants at the Annual International Industrial Organization Conference as well as the North American Summer Meeting of the Econometric Society for helpful suggestions. Our work has also benefited from discussions with Isabelle Perrigne and Quang Vuong as well as comments from three anonymous referees and the editor, M. Hashem Pesaran. Lastly, we appreciate the excellent research assistance of Ian Fillmore. Hickman: Department of Economics, University of Chicago, E 59th Street, Chicago, IL; hickmanbr@uchicago.edu. Hubbard: Department of Economics, Colby College, Mayflower Hill Drive, Waterville, ME; timothy.hubbard@colby.edu.

1 Introduction

The empirical auctions literature is one of the leading success stories for structural research in economics. It combines game theory and bidding data to shed light on a variety of policy-relevant questions, such as revenue maximization (cost minimization in procurement), efficiency, and information rents. Structural econometric techniques in auctions have been applied to an array of market settings such as timber, oil and mineral leases, online auctions, government-backed securities, procurement of taxpayer-funded services, and even college admissions (see Hickman, Hubbard and Sağlam [2012]). Guerre, Perrigne and Vuong [2000, henceforth GPV] made a landmark contribution by developing a fully nonparametric estimator for first-price auctions. GPV overcame two principal deficiencies in the previous literature: first, a priori restrictions on the distribution of private information; and, second, the computational burden associated with parametric methods that require repeatedly solving for equilibrium strategies.

The idea behind the estimator is simple, yet powerful. Bidders in a Bayesian game can view their opponents as a composite rival playing a mixed strategy equal to the equilibrium bid distribution. Therefore, their own bids form best responses to this mixed strategy, given their private values. The beauty of this observation is that bids and bid distributions are observable, allowing private values to be inferred directly from bidding data. Estimation under GPV proceeds in two stages: first, the inverse equilibrium mapping is empirically reconstructed using nonparametric estimates of the distribution and density of bids. For each observed bid, this results in a pseudo value, or an estimate of the private value that motivated the bid as a best response. In the second stage, pseudo private values are used to nonparametrically estimate the latent valuation distribution.

GPV is now standard practice in empirical auctions research. Although the original estimator was developed for single-unit, first-price, sealed-bid auctions with symmetric risk-neutral bidders and independent private values, it has been extended to more general settings, including affiliated private values (Li, Perrigne and Vuong [2000, 2002]), bidder asymmetry (Campo, Perrigne and Vuong [2003]; Flambard and Perrigne [2006]), risk aversion (Guerre, Perrigne and Vuong [2009]; Campo, Guerre, Perrigne and Vuong [2011]), secret reserve prices (Li and Perrigne [2003]), and unobserved, auction-specific heterogeneity (Krasnokutskaya [2011]; Athey, Levin and Seira [2011]). The GPV method is also the basis for nonparametric models of more complex auction mechanisms; e.g., divisible goods as in Hortaçsu and McAdams [2010]. Despite its strengths, the approach suffers some drawbacks which stem from kernel density estimators (KDEs). KDEs applied to distributions with a compact support are known to be inconsistent

at the endpoints, a phenomenon known as boundary effects. As a consequence, KDEs also exhibit excessive bias near the maximum and minimum observations in finite samples. This is because an estimate at a given point relies on data within a neighborhood whose width is defined by the bandwidth parameter, call it h. For points within distance h of the support boundary, part of the neighborhood is necessarily empty, since observations outside the support are impossible. Thus, standard KDEs naturally penalize estimates near the boundary of the support downward. For this reason, these estimators are only uniformly consistent on closed subsets of the interior of the distributional support.

Because of boundary effects, GPV requires an adjustment known as sample trimming. Letting h_b denote the bandwidth based on a sample of bids, trimming is the practice of discarding first-stage pseudo values based on KDEs within an h_b-neighborhood of the sample extrema. Moreover, the problem is compounded in the second stage when the trimmed sample of pseudo values is used to produce a KDE of the private value distribution. Letting h_v denote the second-stage bandwidth, within an h_v-neighborhood of the extremal surviving pseudo values, estimation is also problematic. Thus, there is a subset of the true support (determined by h_b and h_v) on which GPV exhibits a downward bias due to the compounding of boundary effects across both estimation stages.

A more subtle problem also arises from sample trimming: standard bandwidth selection rules become problematic in both stages of estimation. The first-stage bandwidth h_b controls both the mean integrated squared error (MISE) of the bid density estimate f̂_B and the amount of data loss prior to the second stage. Therefore, myopically choosing h_b with only f̂_B in mind, as standard bandwidth selection rules do, will not appropriately minimize the MISE of the second-stage density estimate f̂_V. In the second stage, trimming leads to underestimation of the optimal bandwidth (a function of the variance of private values), because the set of trimmed pseudo values has extreme observations truncated. To our knowledge, this is the first paper to point out the complications in bandwidth selection arising from sample trimming for multi-stage estimators. This is important because bandwidth choice is known to be crucial for determining the statistical properties of KDEs.

To address these problems, we propose a modified GPV estimator, which replaces the standard KDE and sample trimming with a boundary-corrected KDE developed by Zhang, Karunamuni and Jones [1999] and improved upon by Karunamuni and Zhang [2008, henceforth KZ]. We refer to this modified version as BCGPV (for boundary-corrected GPV). For private-value distributions on compact supports having strictly positive densities (two assumptions on which GPV is founded), the KZ density estimator achieves uniform consistency on the entire support, at the same rate as standard

KDEs on the support interior. This leads to substantial improvement in finite-sample performance of the two-stage estimator. First, since BCGPV does not discard any data, one can simply choose h_b and h_v according to standard rules which minimize MISE within each stage. Second, a Monte Carlo study demonstrates that BCGPV provides the researcher with access to a considerably larger portion of the support of private values within feasible sample sizes. Third, BCGPV performs significantly better than standard GPV, as measured by MISE of the private-value density f_V, with the improvement coming both near the endpoints (due to boundary correction and avoidance of data loss) and within the interior of the support (due to improved bandwidth estimation).

There is also a practical advantage to boundary correction: it is easily portable to any setting to which GPV has been extended. We use boundary correction to revisit an application considered by Campo, Perrigne and Vuong [2003] and estimate a model of asymmetric bidders with affiliated private values using data on oil lease auctions. In multi-dimensional settings such as this one, boundary effects become more pronounced due to the fact that optimal bandwidth sizes are increasing in the dimension of the density, holding sample size fixed. We show how boundary correcting rather than sample trimming can significantly alter the economic interpretation of the data: the former uncovers a substantial degree of bidder asymmetry, whereas the latter masks it. Asymmetric auctions are known to admit inefficient allocations, and in the case of oil leases, the US Department of the Interior introduces asymmetry by allowing firms to bid jointly. A relevant policy question concerns the magnitude of the induced inefficiencies. The trimming-based estimator implies zero deadweight loss, since all auctions with inefficient allocations are stripped from the sample. Boundary correction results in an estimated $36.8 million of deadweight loss, or 3% of total revenue across all observed asymmetric auctions.

This paper has the following structure: Section 2 discusses the related literature. Section 3 describes the original GPV estimator with sample trimming, as well as our proposal for a boundary-corrected GPV estimator. Section 4 presents a Monte Carlo study to compare the two estimators' finite-sample performance. In Section 5 we illustrate the modified BCGPV estimator in an empirical application to oil lease auctions data. Section 6 concludes.

2 Related Literature

Here we give a brief overview of sample-trimming alternatives proposed in the auctions literature. We then summarize the statistics literature on boundary correction, which forms the basis for our proposed modification to the GPV method.

2.1 Sample Trimming Alternatives in the Empirical Auctions Literature

Marmer and Shneyerov [2012] propose a quantile-based estimator which avoids explicit trimming because it directly maps the quantiles of the bid distribution into the quantiles of the private value distribution. However, their estimator is still subject to boundary effects because it uses a standard KDE and therefore cannot consistently estimate the 0th and 100th percentiles. Some recent examples in the literature have demonstrated a need for this capability, including Campo, Guerre, Perrigne and Vuong [2011] and Hickman [2010]. Moreover, Marmer and Shneyerov require independent private values, whereas boundary correction can be easily adapted to more general settings. Campo et al. [2011] propose using a one-sided kernel function with the endpoint positioned over the boundary of the bid support. Their proposal, however, does not include a method for correcting small-sample bias within the sets (b̲, b̲ + h_b) and (b̄ − h_b, b̄). It also remains unclear how a one-sided kernel would be optimally oriented over interior points close to the boundary. Boundary correction eliminates both of these problems.

Of course, there is nothing inherent in the GPV method that specifically requires a KDE as the nonparametric density estimator of choice. Other options exist which are not subject to boundary effects, but which could, in principle, be used within a two-stage estimator; e.g., sieve density estimators such as splines, local polynomials, or (global) orthogonal polynomials (see Chen [2007] for a survey). However, in the first-price auctions context, existing asymptotic theory for two-stage estimators was developed assuming use of a KDE (see, e.g., Guerre et al. [2000] or Campo et al. [2003]). Incorporating another density estimation technique into GPV would therefore require derivation of rules for selection of optimal support partitions (for splines or local polynomials) or polynomial orders (for global polynomials) as functions of data that would (hopefully) achieve the optimal nonparametric rate derived by GPV. While this is an interesting question, it is beyond the ambition of the current exercise.

We contribute to the previous literature on nonparametric estimation of first-price auctions in three ways. First, we provide an extensive Monte Carlo investigation of the small-sample performance of two-stage estimators for first-price auctions. We document various subtle and often overlooked problems stemming from sample trimming. Second, we propose a simple solution that is easily portable to more general settings. Our proposal replaces trimming in estimation of first-price auctions with an alternative that is both asymptotically unbiased at the boundaries and well-defined at interior points. Finally, we demonstrate the practical relevance of boundary correction in guiding economic policy through an empirical application to oil lease auctions. (Campo et al. needed a reliable estimate of the bid density specifically at the boundary in order to estimate bidder risk aversion.)

2.2 Boundary Effects and Boundary Correction

Let f denote a density function with support [0, ∞) and consider nonparametric estimation based on a random sample Z_T = {Z_1, Z_2, ..., Z_T}, where Z_t ~ f for each t. The standard KDE is defined by

f̂(x) = (1/(T h)) Σ_{t=1}^{T} κ((x − Z_t)/h),

where κ is a symmetric unimodal kernel satisfying ∫ κ(u) du = 1 and κ(u) = κ(−u) for all u, and h is a bandwidth parameter chosen to approach zero at a rate no faster than O(T^{−1}). One drawback is that f̂ underestimates f on the set [0, h) for an intuitive reason: it cannot detect data outside the boundaries of the support, so it naturally penalizes the density estimate within an h-neighborhood of 0. In fact, the point estimate f̂(0) is known to be inconsistent, a problem that is known as the boundary effect. Because of this problem, standard KDEs are uniformly consistent only on closed interior subsets of the support, and in finite samples they perform poorly near the boundary; see, for example, Silverman [1986] and Jones [1993].

Coping techniques have been developed, including the reflection method and the transformation method. The former involves artificially reflecting data near a boundary outside the support, resulting in

f̂_R(x) = (1/(T h)) Σ_{t=1}^{T} { κ((x − Z_t)/h) + κ((x + Z_t)/h) },  x ∈ [0, h).

Transformation methods map the data onto an unbounded support to get

f̂_T(x) = (1/(T h)) Σ_{t=1}^{T} κ((x − τ(Z_t))/h),  τ : [0, ∞) → R,  τ′(z) > 0 for all z.

While both approaches reduce bias due to boundary effects, they come at a cost of increased variance near the endpoints. Zhang et al. [1999] improved on previous proposals with a hybrid reflection-transformation estimator. KZ then modified Zhang et al. to achieve improved asymptotic performance. Formally, the boundary-corrected KZ density estimator, denoted f̂_BC, is

f̂_BC(x; Z_T) = (1/(T h)) Σ_{t=1}^{T} κ_c(x, Z_t; h),   (1)

where

κ_c(x, Z_t; h) ≡ κ((x − Z_t)/h) + κ((x + τ(Z_t))/h),  τ(y) = y + d̂ y² + A d̂² y³,
d̂ = [ log(φ_T(h_1, Z_T)) − log(ψ_T(h_1, Z_T)) ] / h_1,
φ_T(h_1, Z_T) = 1/T² + (1/(T h_1)) Σ_{t=1}^{T} κ₀((h_1 − Z_t)/h_1),
ψ_T(h_1, Z_T) = max{ 1/T², (1/(T h_1)) Σ_{t=1}^{T} κ₀(Z_t/h_1) },   (2)

where κ is a symmetric kernel with support [−1, 1]; A > 1/3; h_1 = o(h); κ₀ : [−1, 0] → R is an optimal endpoint kernel given by κ₀(y) = 6 + 18y + 12y²; and h_0 = ηh, with

η = { (∫ x² κ(x) dx)² ∫ κ₀(x)² dx / [ (∫ x² κ₀(x) dx)² ∫ κ(x)² dx ] }^{1/5}.

The main bandwidth h is chosen according to Silverman's (1986) MISE-minimizing rule applied to the primary kernel function κ. KZ establish the result that this estimator is uniformly consistent, which we reproduce here for the reader's reference:

KZ Theorem: Suppose density f has support [0, ∞), is strictly bounded away from zero, and has a continuous second derivative f″ within a neighborhood of 0, and consider f̂_BC as defined above with h = O(T^{−1/5}). Then for x = αh, α ≥ 0, f̂_BC has bias and variance given by

E[f̂_BC(x; Z_T)] − f(x) = h² { (f″(0)/2) ∫ t² κ(t) dt − 6(A − 1/3) ((f′(0))²/f(0)) ∫_α^1 (t − α)² κ(t) dt } + o(h²)   (3)

and

V[f̂_BC(x; Z_T)] = (f(0)/(T h)) { ∫_{−1}^{α} κ(t)² dt + 2 ∫_{2α−1}^{α} κ(t) κ(2α − t) dt } + o((T h)^{−1}).   (4)

Structural work in auctions is typically based on the assumption of a compact support, so a slight adaptation is in order. KZ show that on the region [h, ∞) the term involving τ on the first line of equation (2) drops out, meaning the estimator reduces to the standard KDE. Thus, adapting f̂_BC : [0, ∞) → R_+ to an arbitrary compact support [z̲, z̄] is straightforward for any h < (z̄ − z̲)/2: let

f̂(x) = f̂_BC(x − z̲; Z̲_T) if x ∈ [z̲, z̄ − h],  and  f̂(x) = f̂_BC(−x + z̄; Z̄_T) if x ∈ [z̄ − h, z̄],   (5)

where Z̲_T = {Z_1 − z̲, Z_2 − z̲, ..., Z_T − z̲} and Z̄_T = {−Z_1 + z̄, −Z_2 + z̄, ..., −Z_T + z̄}. The data transformations which define f̂ allow KZ's logic to be applied separately to each of the disjoint extremal subsets [z̲, z̲ + h] and [z̄ − h, z̄] to deliver uniform consistency. Specifically, by the KZ theorem, applying f̂_BC to the transformed data Z̲_T delivers uniform consistency on the former interval, and applying f̂_BC to the transformed data Z̄_T delivers uniform consistency on the latter interval. Therefore, it immediately follows from KZ that f̂ is uniformly consistent on the entire support [z̲, z̄], including at the endpoints. Note also that in equations (3) and (4), if the second term within the curly brackets involving α is removed, then (ignoring the higher-order terms) we have the same bias and variance as a standard KDE on the interior of its support (see Silverman [1986]). Since those terms are bounded for α ∈ [0, 1], this implies that f̂ is uniformly consistent at the same rate of convergence as standard KDEs.
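To fix ideas, the sketch below implements a simplified version of the boundary-corrected estimator in equations (1), (2), and (5) in Python. It is a minimal illustration under stated assumptions, not the implementation used here: it employs the triweight kernel everywhere (including where the optimal endpoint kernel κ₀ would appear in the pilot step), and the function names, the pilot bandwidth choice, and the default A = 0.55 are our own illustrative choices.

```python
import numpy as np

def triweight(u):
    """Triweight kernel, support [-1, 1]."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1.0, (35.0 / 32.0) * (1.0 - u**2) ** 3, 0.0)

def kde(x, data, h, kernel=triweight):
    """Standard KDE evaluated at the points in x."""
    x = np.atleast_1d(x)
    u = (x[:, None] - data[None, :]) / h
    return kernel(u).sum(axis=1) / (len(data) * h)

def bc_kde_left(x, data, h, A=0.55):
    """Reflection-transformation KDE near a left boundary at 0 (simplified KZ form)."""
    T = len(data)
    h1 = h * T ** (-1.0 / 5.0)                     # pilot bandwidth; an assumption, h1 = o(h)
    # Pilot estimate of d ~ f'(0)/f(0) via a log-difference of density estimates near 0
    phi = kde(h1, data, h1)[0] + 1.0 / T**2
    psi = max(kde(0.0, data, h1)[0], 1.0 / T**2)
    d = (np.log(phi) - np.log(psi)) / h1
    tau = data + d * data**2 + A * d**2 * data**3  # transformed data used in the reflected term
    x = np.atleast_1d(x)
    u1 = (x[:, None] - data[None, :]) / h          # usual kernel term
    u2 = (x[:, None] + tau[None, :]) / h           # reflected, transformed term
    return (triweight(u1) + triweight(u2)).sum(axis=1) / (T * h)

def bc_kde(x, data, lo, hi, h):
    """Boundary-corrected KDE on a compact support [lo, hi], as in equation (5)."""
    x = np.atleast_1d(x)
    left = bc_kde_left(x - lo, data - lo, h)       # correction near the lower bound
    right = bc_kde_left(hi - x, hi - data, h)      # correction near the upper bound
    return np.where(x <= hi - h, left, right)
```

In the interior the reflected term vanishes (its kernel argument exceeds one), so the sketch reduces to the standard KDE there, mirroring the remark above; substituting κ₀ and the bandwidth relation h_0 = ηh in the pilot step would bring it closer to the KZ construction.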

Although for finite T the bias and variance are larger within a neighborhood of the endpoints, in large samples they differ from the interior only by a multiplicative constant. This hinges on the fact that the trailing terms in (3) and (4) are of order little-o, meaning they approach zero at a faster rate than the leading terms. KZ adapt the estimator of Zhang et al. [1999], where the bias and variance have trailing terms of order O(h²) and O((T h)^{−1}), respectively. While seemingly marginal, this is important in the two-stage estimation setting, as it means that the asymptotic behavior near the boundaries and in the interior is the same. Another reason why we chose to concentrate on the KZ boundary correction method is that its assumptions resemble those required for nonparametric estimation in first-price auctions. GPV requires that the private value density (and, hence, the bid density as well) is bounded away from zero, and Zhang et al. [1999] (on which KZ is based) tends to perform well in finite samples relative to other boundary-corrected estimators when the density is strictly positive at the boundary; see Karunamuni and Alberts [2005] for a comparison of various distinct boundary correction methods.

2.2.1 Practical Issues

There are some free parameters which must be pinned down in order to implement KZ. With the exception of the main bandwidth, h, these choices have little effect, within certain bounds. A is generally inconsequential so long as A > 1/3; in implementation we will select A = 0.55, as suggested by KZ. The KZ theorem requires a kernel with compact support; therefore we use the triweight kernel,

κ(u) = (35/32) (1 − u²)³ 1{|u| ≤ 1},

where 1{·} is an indicator function. We recommend selecting the primary bandwidth value h via Silverman's (1986) optimal global bandwidth rule

h = T^{−1/5} [ ∫ κ(x)² dx / ( (∫ x² κ(x) dx)² ∫_{z̲}^{z̄} (d²f/dx²)² dx ) ]^{1/5},   (6)

where the denominator integral ∫_{z̲}^{z̄} (d²f/dx²)² dx is substituted by (3/(8√π)) σ̂^{−5}, and where σ̂ is the sample standard deviation. The denominator of (6) contains an expression involving the true density f. Given that this is unknown, we follow Silverman [1986], who suggests replacing f in (6) with a standard family of distributions, namely the normal. This substitution has become known as Silverman's rule of thumb and is now standard practice in applications; see Härdle [1991], Guerre et al. [2000], and Campo et al. [2003], among many others. Finally, there are many ways to choose h_1 = o(h); we follow the choice proposed by KZ.
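A rule-of-thumb bandwidth of the form (6) is straightforward to compute once the kernel constants are known. The sketch below evaluates the triweight constants and the normal-reference substitution described above; the function name and the use of the sample standard deviation of whatever data are at hand are our own illustrative choices.

```python
import numpy as np

# Triweight kernel constants: roughness = integral of kappa^2, second moment = integral of x^2 kappa.
TRIWEIGHT_ROUGHNESS = 350.0 / 429.0
TRIWEIGHT_SECOND_MOMENT = 1.0 / 9.0

def silverman_bandwidth(sample):
    """Rule (6) with the normal reference: the unknown integral of (f'')^2 is
    replaced by 3 / (8 * sqrt(pi) * sigma_hat^5)."""
    sample = np.asarray(sample, dtype=float)
    sigma_hat = sample.std(ddof=1)
    roughness_f2 = 3.0 / (8.0 * np.sqrt(np.pi) * sigma_hat**5)
    h_to_fifth = TRIWEIGHT_ROUGHNESS / (TRIWEIGHT_SECOND_MOMENT**2 * roughness_f2)
    return (h_to_fifth ** 0.2) * len(sample) ** (-0.2)
```

This is numerically equivalent to the kernel-specific constant form h = c_κ σ̂ T^{−1/5} discussed in Section 3, with c_κ roughly 3.15 for the triweight kernel.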

3 Nonparametric Estimation of First-Price Auctions

In this section, we outline the standard GPV method, paying special attention to the implications of sample trimming. We then describe our BCGPV proposal. Consider an environment with N bidders competing for a single object in a first-price, sealed-bid auction. Each bidder independently draws a private valuation for the object from a commonly-known distribution F_V(v) on a compact support [v̲, v̄]. We also assume that the private value distribution has a continuous density which is strictly positive on the closure of the support. In addition, we assume that F_V has a continuous second derivative within a neighborhood of v̲ and v̄. Denote a bid by b, let the equilibrium bid function be denoted β(v) = b with inverse ξ(b) = v, and let the equilibrium bid distribution be denoted F_B(b), with density f_B(b). For a sample of L auctions all bids are observed, providing the econometrician with T = NL independent bids {b_t}_{t=1}^{T}, with the maximal and minimal observed bids being denoted b_max and b_min, respectively. For ease of exposition we consider only the case where auctioned items are identical, N is fixed across auctions, and the seller has no reserve price. It is straightforward to relax these additional restrictions, but such changes do not affect the comparison between trimming and boundary correction, so they are ignored here. The Bayes-Nash equilibrium in this model is described by

β′(v) = [v − β(v)] (N − 1) f_V(v)/F_V(v),   (7)

together with boundary condition β(v̲) = v̲. In order to identify and estimate the model, GPV used the fact that auction theory implies monotonicity of β to transform the above differential equation.

3.1 Standard GPV with Sample Trimming

Since β is strictly increasing, F_B(b) = F_V[ξ(b)] = F_V(v), and f_B(b) = f_V[ξ(b)] ξ′(b) = f_V[ξ(b)] / β′[ξ(b)]. Substituting these terms into (7), we get

v = b + F_B(b) / ((N − 1) f_B(b)) ≡ ξ(b),   (8)

or in other words, private values can be decomposed as the sum of bids and a markdown factor which depends on the bid distribution and the number of bidders. The beauty of this discovery comes from the right-hand side of the equation being identified and estimable nonparametrically from bidding data alone. The econometrician can directly recover the private values which motivated observed bids as best responses to the bid distribution, so we have the following two-stage estimation strategy:

Stage 1: For each b_t, estimate a pseudo value v̂_t = b_t + F̂_B(b_t) / ((N − 1) f̂_B(b_t)) = ξ̂(b_t), where F̂_B is the empirical distribution of observed bids, and f̂_B is a KDE estimated with bandwidth h_b.

Intermediate Step (Sample Trimming): Re-define v̂_t = +∞ for each v̂_t = ξ̂(b_t) such that b_t ∈ [b_min, b_min + h_b] ∪ [b_max − h_b, b_max].

Stage 2: Estimate the private value density by f̂_V(x) = (1/(T h_v)) Σ_{t=1}^{T} κ((x − v̂_t)/h_v), where h_v is an optimally chosen bandwidth.
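Written out, the two stages with trimming amount to a few lines of code. The sketch below is a minimal illustration under the symmetric independent-private-values setup of this section; it reuses the kde and triweight helpers from the sketch in Section 2.2, and the bandwidth inputs h_b and h_v are taken as given rather than derived from a selection rule.

```python
import numpy as np

def gpv_pseudo_values(bids, N, h_b):
    """Stage 1: pseudo values v_t = b_t + F_B(b_t) / ((N - 1) f_B(b_t))."""
    T = len(bids)
    F_hat = np.searchsorted(np.sort(bids), bids, side="right") / T   # empirical CDF at each bid
    f_hat = kde(bids, bids, h_b)                                     # standard KDE of the bids
    return bids + F_hat / ((N - 1) * f_hat)

def gpv_trim(bids, pseudo, h_b):
    """Intermediate step: drop pseudo values whose bids lie within h_b of the sample extrema."""
    keep = (bids > bids.min() + h_b) & (bids < bids.max() - h_b)
    return pseudo[keep]

def gpv_density(x, pseudo_trimmed, T, h_v):
    """Stage 2: KDE of the private-value density, dividing by the full sample size T."""
    u = (np.atleast_1d(x)[:, None] - pseudo_trimmed[None, :]) / h_v
    return triweight(u).sum(axis=1) / (T * h_v)
```

Dropping trimmed observations while dividing by the full T is equivalent to setting their pseudo values to +∞, as in the intermediate step above.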

Note the intermediate step involving sample trimming; this is due to boundary effects from kernel density estimation in Stage 1. In practice, f̂_B exhibits substantial bias within a distance of h_b of the sample extrema, so the standard coping strategy is to discard the corresponding pseudo values from the sample, so as to avoid contaminating estimation in the second stage. (Note that the full sample size T is used in Stage 2. Otherwise, if the trimmed sample size were used, the result would not estimate the private value density, but rather, the density conditional on being outside the trimmed region.)

3.1.1 Implications of Sample Trimming

Although sample trimming addresses the excessive bias propagating to the second stage, it creates three new problems as well. The first and most obvious is that sample trimming involves a loss of data. Let v̂_min ≡ min{v̂_t : v̂_t < ∞} and v̂_max ≡ max{v̂_t : v̂_t < ∞} denote, respectively, the minimum and maximum of the untrimmed pseudo private values, and note that nonparametric inference on the complement of [v̂_min, v̂_max] is impossible despite the existence of bid data outside the image of this region under the mapping β(·). Moreover, f̂_V will integrate to (1/T) Σ_{t=1}^{T} 1{v̂_t < ∞} < 1.

The second problem is that sample trimming compounds boundary effects across both stages. Not only is there an absence of data beyond the endpoints of the private value support, but there are also no data within a neighborhood of the endpoints as well. This causes the biased region to be larger than if the econometrician could directly observe private values. As GPV pointed out, f̂_V will be biased downward on the set Ω̄ ≡ [v̲, ξ(b_min + h_b) + h_v) ∪ (ξ(b_max − h_b) − h_v, v̄]. It will be convenient at times to refer to the complement of this set as Ω ≡ [v̲, v̄] \ Ω̄, being the interior region on which the standard GPV estimator performs best. For the supremum (infimum) of the lower (upper) interval comprising Ω̄, the term involving ξ accounts for the region where sample trimming has eliminated data, and the h_v term accounts for the contribution of the second-stage boundary effect.

The third problem is a bit more subtle, though perhaps most salient: sample trimming greatly complicates the task of optimal bandwidth selection. In kernel density estimation, there are two degrees of freedom for the researcher: choice of the kernel function κ, and choice of the bandwidth parameter, h. It is well known that the former is of little consequence, but the latter plays a crucial role in determining the statistical properties of the estimator. Small h leads to under-smoothing (for very small values the KDE will have a local mode centered over each datum) and large h leads to over-smoothing (for very large values the KDE is essentially uniform).

In practice, researchers typically resolve bandwidth selection by minimizing MISE. Silverman [1986] showed that for a given random sample {x_t}_{t=1}^{T} with X_t ~ f for each t, if κ is a unimodal, symmetric density then the minimal MISE criterion results in the following rule, which has since been widely implemented in structural auctions research:

h = c_κ σ T^{−1/5},   (9)

where σ is the standard deviation of X_t under f, T is the sample size, and c_κ is a kernel-specific constant. Derivation of an approximate form of the optimal kernel-specific constant c_κ is due to Silverman [1986]:

c_κ = [ 8√π ∫ κ(x)² dx / ( 3 (∫ x² κ(x) dx)² ) ]^{1/5},

and Härdle [1991] later tabulated values of c_κ for various commonly used kernel functions. In practice, σ has been estimated using the sample standard deviation σ̂.

When it comes to two-stage nonparametric estimation of the private value density f_V, sample trimming complicates bandwidth selection in both stages. In the second stage, estimation of σ = σ_v in equation (9) is problematic because the set of trimmed pseudo values is a non-random sample. Using the sample standard deviation of trimmed pseudo values will underestimate the population standard deviation (and the optimal bandwidth by extension) because extreme observations have been systematically eliminated from the sample. Although (9) may be optimal within the second stage, in finite samples σ_v is effectively a nuisance parameter because using the sample analog based on the pseudo values that survive trimming is no longer appropriate. Bandwidth selection in the first stage is hampered at a more basic level because myopically minimizing the MISE of f̂_B need not be optimal from the standpoint of minimal MISE for f̂_V. Indeed, our Monte Carlo study in Section 4 serves as a set of counterexamples demonstrating that choosing h_b by equation (9) is generally suboptimal from the perspective of the second stage: reducing the first-stage bandwidth by nearly one-third relative to (9) can dramatically decrease second-stage MISE. Intuitively, when selecting an optimal value for h_b, sample trimming creates a trade-off between minimal MISE of f̂_B, which is desirable in order to minimize second-stage MISE on the set Ω, and minimizing the size of the outer region Ω̄, a function of h_b, where negative bias results from data loss. While the former goal is achieved by choosing the first-stage bandwidth according to (9), the second goal is left unaddressed, as (9) was derived with only one-stage KDEs in mind. For this reason, the second-stage-optimal choice of h_b will tend to be lower than that prescribed by Silverman's rule when trimming is employed. One natural remedy might be to derive new bandwidth selection rules for the first and second stages which take sample trimming into account, though this would not address the problems of data loss and excessive bias in the region Ω̄.
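As a quick numerical illustration of the second-stage problem, the sketch below reuses silverman_bandwidth from Section 2.2.1 and compares the bandwidth implied by a full sample of pseudo values with the bandwidth implied by the same sample after its extreme observations are removed. The simulated distribution and the fraction removed are arbitrary placeholders; the point is only that truncating the tails shrinks σ̂_v and hence h_v.

```python
import numpy as np

rng = np.random.default_rng(0)
pseudo_values = rng.beta(2.0, 4.0, size=1000)         # stand-in for untrimmed pseudo values
cut = 0.05                                             # drop 5% in each tail, mimicking trimming
lo, hi = np.quantile(pseudo_values, [cut, 1.0 - cut])
trimmed = pseudo_values[(pseudo_values > lo) & (pseudo_values < hi)]

print("h_v from full sample   :", silverman_bandwidth(pseudo_values))
print("h_v from trimmed sample:", silverman_bandwidth(trimmed))    # smaller, since sigma_hat shrinks
```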

However, we instead propose an alternate approach that eliminates altogether the need for such an adjustment by replacing sample trimming (and the complications that go with it) with boundary correction. Finally, note that sample trimming in GPV and related estimators is typically justified with the argument that the problems it causes disappear as T grows. One concern from a practical standpoint is that all of the issues discussed above depend on the size of the bandwidth parameters, h_b and h_v, and these vanish very slowly, being O(T^{−1/5}). For context, if a researcher wishes to decrease bandwidths by half, she must increase her sample size by a factor of 2⁵ = 32; to decrease them by an order of magnitude, she must increase her sample size by a factor of 10⁵! With sample sizes of practical relevance, the above problems may be severe, as we document in a Monte Carlo study in Section 4.

3.2 A Boundary-Corrected GPV Estimator

One difficulty in estimation of first-price auctions is that the boundaries of the bid distribution, b̲ and b̄, are functions of the private value distribution, meaning that they are ex ante unknown to the econometrician. Knowledge of the boundaries is obviously important in order to perform boundary correction. Fortunately though, good estimators are readily available: b̲̂ ≡ b_min and b̄̂ ≡ b_max. GPV showed that the sample extrema are consistent estimators of the bid support bounds, converging at a relatively fast rate of (log T)/T. (For a discussion of the asymptotic distributions of these estimators, see Arnold, Balakrishnan and Nagaraja [2008].) With that in mind, it will be necessary to define some additional notation before outlining the modified version of the GPV estimator. For a given sample of data {z_t}_{t=1}^{T}, define f̲̂(x) as estimator (1) applied to {z̲_t}_{t=1}^{T}, where z̲_t = z_t − z_min for each t, and define f̄̂(x) as estimator (1) applied to {z̄_t}_{t=1}^{T}, where z̄_t = −z_t + z_max for each t. Boundary-corrected GPV estimation, or BCGPV, proceeds in the following two stages:

Stage 1: For each observed bid b_t, estimate a pseudo private value ṽ_t via ṽ_t = b_t + F̃_B(b_t) / ((N − 1) f̃_B(b_t; h_b)) ≡ ξ̃(b_t, N), where F̃_B is the empirical distribution of bids {b_t}_{t=1}^{T}, f̃_B is the boundary-corrected KDE for bids defined as in equation (5), and h_b is an appropriately chosen bandwidth.

Stage 2: Let ṽ_min ≡ min{ṽ_t}_{t=1}^{T} and ṽ_max ≡ max{ṽ_t}_{t=1}^{T} denote, respectively, the minimum and maximum of the boundary-corrected pseudo values. Then the BCGPV estimator of the private value density is f̃_V(x; h_v) from equation (5) applied to the sample of boundary-corrected pseudo values {ṽ_t}_{t=1}^{T}, where h_v is an appropriately chosen bandwidth.
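Under the same assumptions as the earlier sketches, the modified procedure differs from the trimming version in only two places: the bid density in Stage 1 comes from the boundary-corrected estimator, and no observation is discarded before Stage 2. The helper bc_kde below is the boundary-corrected KDE sketched in Section 2.2; the names and bandwidth inputs are again illustrative.

```python
import numpy as np

def bcgpv(bids, N, h_b, h_v, grid):
    """Two-stage BCGPV sketch: boundary-corrected KDE in both stages, no trimming."""
    T = len(bids)
    b_lo, b_hi = bids.min(), bids.max()               # estimated bid-support endpoints
    # Stage 1: pseudo values using the boundary-corrected bid density
    F_hat = np.searchsorted(np.sort(bids), bids, side="right") / T
    f_hat = bc_kde(bids, bids, b_lo, b_hi, h_b)
    pseudo = bids + F_hat / ((N - 1) * f_hat)
    # Stage 2: boundary-corrected KDE of the pseudo values on their estimated support
    v_lo, v_hi = pseudo.min(), pseudo.max()
    return bc_kde(grid, pseudo, v_lo, v_hi, h_v)
```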

3.2.1 Asymptotic Properties

Guerre et al. [2000, Theorem 3] showed the standard estimator to be uniformly consistent at the optimal nonparametric rate within closed interior subsets of the private value support. Since this result establishes uniform convergence and asymptotic efficiency on sets measuring arbitrarily close to one, there is little room for improvement in the asymptotic theory. Indeed, as we argue below in a Monte Carlo study and in an empirical application, the primary benefits to replacing sample trimming with boundary correction in applied work arise from finite-sample properties of the estimator: for sample sizes of practical relevance, it significantly reduces bias and variance and allows the researcher to draw nonparametric inferences based on a larger portion of the private value support. In this section we briefly explain why the argument behind GPV's optimal convergence result is still appropriate when replacing sample trimming with standard KDEs by boundary correction.

Maintaining various original assumptions of GPV, suppose that V is independently and identically distributed as F_V(·) on known support [v̲, v̄] (GPV Assumption A1); having density f_V ≥ c_v > 0 (GPV Assumption A2(ii)); moreover, let F_V admit up to R + 1 continuous bounded derivatives (GPV Assumption A2(iii)). Together these imply some regularity conditions on equilibrium objects arising from the model. Finally, we also maintain GPV's assumption that the (primary) bandwidths used for kernel density estimation are of the form h_b = λ_b (log T / T)^{1/(2R+3)} and h_v = λ_v (log T / T)^{1/(2R+3)} (GPV Assumption A4(ii)). Now, rather than assume a KDE with a standard kernel function κ (as in GPV Assumption A3), consider a boundary-corrected kernel function κ_c satisfying the following:

(i) κ_c follows definition (2);
(ii) κ(·) is a symmetric kernel with support [−1, 1];
(iii) A > 1/3 and h_1 = o(h);
(iv) κ₀ : [−1, 0] → R is an optimal endpoint kernel, given by κ₀(y) = 6 + 18y + 12y²; and
(v) h_0 = ηh, with η defined as in Section 2.2.

(Again, we omit discussion of assumptions concerning covariates and varying numbers of bidders, in order to simplify notation and concentrate on the comparison between sample trimming and boundary correction. These additional assumptions, outlined in detail by GPV, would remain unchanged if the more general version of the model were considered. The regularity conditions on equilibrium objects are, namely: the bid support has a positive finite length; b̲ = v̲; the bid density exists, with f_B ≥ c_b > 0 for b ∈ [b̲, b̄]; and f_B admits up to R + 1 continuous bounded derivatives on its support; see GPV's first proposition.)

These ensure good behavior of the one-stage KDEs that will make up the two-stage BCGPV estimator. Together they replace GPV Assumption A3, which established regularity conditions for standard kernel functions that allowed them to invoke asymptotic results proven by Härdle [1991] on closed inner subsets of the support. Specifically, Härdle [1991] showed that the bias of the standard KDE f̂ is proportional to h² and its variance is proportional to (hT)^{−1}. However, Härdle's result is only valid on the interior set [z̲ + h, z̄ − h] where boundary effects are not present. This is why GPV's original proof based on their assumptions A1-A4 is confined to demonstrating uniform convergence on interior compact subsets of the private value support. Recall from above that the KZ theorem established these same results within an h-neighborhood of the support endpoints for κ_c satisfying the above conditions. Therefore, the boundary-corrected KDE f̂ has bias proportional to h² and variance proportional to (hT)^{−1} on the entire support. From this fact it follows straightforwardly that if one replaces the reference to Härdle [1991] with a reference to Karunamuni and Zhang [2008] instead, the original proof of asymptotic properties established in GPV Theorem 3 still holds for BCGPV on any compact subset of the support; namely, it is uniformly consistent at the optimal nonparametric rate, which GPV showed to be (log T / T)^{R/(2R+3)}.

Note that, as GPV argued, their Theorem 3 is useful in practice only if the support [v̲, v̄] is known so that one can choose appropriately the compact subset, or the reference points of the boundary correction procedure. This support can be estimated. Specifically, because v̲ = b̲, the lower bound v̲ is estimated from observed bids using v̲̂ = min_t b_t directly. Regarding the upper bound v̄, GPV proposed an estimator of the form v̄̂ = max_t v̂_t. With this in mind, we now turn to an investigation of performance of the estimators in finite samples, where boundary correction is most valuable.

4 Finite-Sample Comparisons

Given the slow rates of convergence for two-step nonparametric estimators, the principal gains to boundary correction are in finite-sample performance. This is especially true in the empirical auctions literature, which has long been plagued by small sample sizes and the necessity for the researcher to bin the data, or partition them by factors such as number and configuration of bidders. In this section, we present a Monte Carlo study that compares the classic GPV estimator with our proposed BCGPV alternative along a number of dimensions. (Note that GPV argued that from their Proposition 3(ii) and a proof analogous to that of their earlier proposition, it can be readily shown that the support can be estimated uniformly. This still applies to BCGPV, but with finite-sample improvement in the performance of the proposed estimator due to boundary correction.)

[Figure 1: Monte Carlo Study Examples. Panel (A), Probability Density Functions: f_1(v), f_2(v), f_3(v), and f_4(v) plotted against v. Panel (B), Equilibrium Bids: β_1(v), β_2(v), β_3(v), and β_4(v) plotted against v, together with the 45-degree line.]

Doing so allows us to highlight the various implications of sample trimming discussed in Section 3.1.1 that are often overlooked by applied researchers. Boundary correction provides a vast improvement within sample sizes typical of those seen in practice. Moreover, these problems remain relevant for much larger sample sizes than those we consider, due to the slow rate at which optimal bandwidths approach zero.

4.1 Monte Carlo Procedure

We consider four different private value distributions over the support [v̲, v̄] = [0, 1]. Specifically, we use mixtures of beta distributions of the form F(v; θ_1, θ_2, γ) = γ·Uniform(0, 1) + (1 − γ)·Beta(θ_1, θ_2), where γ is a weighting parameter. Mixing with a uniform ensures the densities are strictly bounded away from zero over the entire support. We considered four such mixtures, F_1, F_2, F_3, and F_4, which differ in their beta parameters; the weight γ is small, so as to provide positive mass at the boundaries while preserving the features of the original beta distribution. The densities are depicted in Figure 1(A): we consider one symmetric distribution (F_1) and three skewed distributions (F_2, F_3, F_4); two have interior modes (F_1 and F_4), and the others have modes at opposite extremes (F_2 and F_3). As such, we argue that the results do not depend on the shape of the valuation distribution. The equilibrium bid functions implied by each distribution are depicted in Figure 1(B). We simulate a scenario in which the econometrician observes all bids for L first-price auctions, each involving N = 5 bidders, implying a sample size of T = NL bids. Our procedure went as follows:

1. T independent private values were drawn from F_j, j ∈ {1, 2, 3, 4}, and mapped into bids via the symmetric equilibrium bid function as characterized by Riley and Samuelson [1981] (a sketch of this step follows the list).

2. Given the T bids, the private value density was estimated using a triweight kernel in three ways:
(i) via the standard GPV approach with sample trimming, using bandwidth-selection rule (9) in both first- and second-stage estimation;
(ii) via the standard GPV approach with trimming, but in which we artificially reduce the first-stage bandwidth relative to (9), yet adopt this criterion in the second stage;
(iii) via the BCGPV approach, with bandwidth-selection rule (9) in both stages.

3. Repeat the process S times for each distribution F_j, providing S nonparametric estimates of the private value density for each of the three strategies described.
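The following sketch illustrates step 1 under the symmetric independent-private-values model: values are drawn from a uniform/beta mixture and mapped into bids with the Riley and Samuelson [1981] equilibrium bid function, β(v) = v − ∫_{v̲}^{v} F(u)^{N−1} du / F(v)^{N−1}, evaluated here by numerical quadrature. The particular mixture weight, beta parameters, and sample size in the snippet are placeholders, not the values used in the study.

```python
import numpy as np
from scipy import stats

def mixture_cdf(v, theta1, theta2, gamma):
    """F(v) = gamma * Uniform(0,1) + (1 - gamma) * Beta(theta1, theta2)."""
    return gamma * v + (1.0 - gamma) * stats.beta.cdf(v, theta1, theta2)

def equilibrium_bid(v, N, cdf, n_grid=2000):
    """Riley-Samuelson bid: beta(v) = v - int_0^v F(u)^(N-1) du / F(v)^(N-1)."""
    v = np.atleast_1d(v)
    u = np.linspace(0.0, 1.0, n_grid)
    FN = cdf(u) ** (N - 1)
    integral = np.concatenate(([0.0], np.cumsum(0.5 * (FN[1:] + FN[:-1]) * np.diff(u))))
    shade = np.interp(v, u, integral) / np.maximum(cdf(v) ** (N - 1), 1e-12)
    return v - shade

# Step 1 of the Monte Carlo procedure (N = 5 as in the text; other parameters illustrative):
rng = np.random.default_rng(1)
N, T = 5, 1000
gamma, theta1, theta2 = 0.25, 2.0, 4.0
mix = rng.random(T) < gamma
values = np.where(mix, rng.random(T), rng.beta(theta1, theta2, T))
bids = equilibrium_bid(values, N, lambda v: mixture_cdf(v, theta1, theta2, gamma))
```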

Estimation strategies (i) and (ii) both involve the classic two-step GPV procedure with trimming, and both employ bandwidth formulas that are O(T^{−1/5}), as required by GPV Assumption A4(ii) for convergence at the optimal nonparametric rate. However, optimality in the sense of minimal MISE for a fixed sample size is another matter. Strategies (i) and (ii) differ by whether they adhere to bandwidth rule (9) in stage one. This rule was derived by Silverman [1986] to minimize MISE of KDEs, and values of the kernel-specific constants c_κ were tabulated by Härdle [1991]. As suggested by GPV and Li et al. [2002], strategy (i) is typically implemented in applied work, and the reason why is simple: this method minimizes MISE of f̂_B on the relevant interior set, resulting in (interior) private value estimates with maximal statistical precision. As long as the sample size is large enough so that Ω̄ (Ω's complement) is small, this would be appropriate. Otherwise, an important trade-off arises as discussed above: (9) does not account for the effect of h_b on the size of Ω̄, so the ultimate goal of minimizing second-stage MISE over the entire private value support implies an optimal first-stage bandwidth strictly below (9). Unfortunately, as noted by Athey and Haile [2007], data-driven bandwidth selection procedures have not been explored in the literature on two-stage nonparametric estimators. Fortunately, the BCGPV estimator lays this concern to rest by eliminating the externalities imposed on the second stage by first-stage bandwidth selection.

The primary comparison of interest is between strategies (i) and (iii). However, to illustrate the first-stage bandwidth trade-off under sample trimming, we also consider in strategy (ii) a simple, and admittedly arbitrary, alternative in which we replace the triweight constant c_Triweight with the Gaussian constant c_Gaussian = c_Triweight/2.978 when estimating h_b. To be clear, we recognize that this arbitrary approach has no theoretical foundation; it does not minimize MISE on Ω and would not be optimal for large samples. (Note that the optimal nonparametric bandwidths h_b = λ_b (log T / T)^{1/(2R+3)} and h_v = λ_v (log T / T)^{1/(2R+3)}, due to GPV, depict a class of bandwidths, differing by a multiplicative constant, which guarantee desirable asymptotic behavior, but do not establish a single optimal rule. If the researcher assumes only that the distribution is once continuously differentiable, or R = 1, then Silverman's automatic bandwidth rule (9) applies; see the discussion of c_κ above.)

Although (ii) is not based on any well-defined optimality criterion, we use it to demonstrate that one could do better in practice under sample trimming by deviating from method (i), which does not internalize stage-two consequences when choosing h_b. Inclusion of (ii) provides insight into what drives the improvement gained by the BCGPV estimator.

4.2 Monte Carlo Results

Figures 2 through 5 depict a series of plots of the true density along with the pointwise 95th, 50th, and 5th percentiles of the estimators. We quantify the information in the plots in Table 1. Each figure involves three panels to illustrate performance across each respective estimator. The left-most panel depicts performance using (i), which we refer to as Trim(h_b, h_v) as it involves trimming and use of (9) for selection of both bandwidths h_b and h_v. The middle panel depicts performance using (ii), which we refer to as Trim(h_bG, h_v) as it involves trimming and use of a first-stage Gaussian bandwidth and (9) in the second stage. The panel on the right depicts the BCGPV estimator (iii). In each figure we display the average region over which valid inference is possible using vertical lines. For the two cases with trimming, they represent E[Ω_s] = [E{ξ̂_s(b_s^min + h_s^b) + h_s^v}, E{ξ̂_s(b_s^max − h_s^b) − h_s^v}], bounded by the mean upper and lower endpoints of Ω_s over all simulations s. Recall that outside of Ω_s, nonparametric inference is either hampered by boundary effects or impossible due to missing data. Note also that the size of this set is dependent on the bandwidths, which vary across the different strategies considered, so we shall differentiate them by superscripts, E[Ω_s^(i)] and E[Ω_s^(ii)]. For the boundary correction case, the solid vertical lines represent the average minimum and maximum of the sample of pseudo values, so, abusing notation slightly, we call this region E[Ω_s^BC].

In comparing the three panels, a general pattern is that E[Ω_s^(i)] ⊆ E[Ω_s^(ii)] ⊆ E[Ω_s^BC]; that is, the average region of valid inference is largest under boundary correction and smallest under Trim(h_b, h_v). Because BCGPV does not require any trimming, the vertical lines are farthest apart and much closer to the bounds of the true support than under either trimming strategy. In contrast, the extremal pseudo values that survive trimming, ξ̂(b_min + h_b) and ξ̂(b_max − h_b), are strictly bounded away from v̲ and v̄ by a function of the first-stage bandwidth. Moreover, because bandwidths are of order O(T^{−1/5}), the gap between E[Ω_s^BC] and E[Ω_s^(i)] closes very slowly. The vertical lines alone make clear that there is a nontrivial difference in what can be inferred from bid data to which the three methods were applied. (Within simulation s, if a point v ∈ [v̲, v̄] lay outside the range of the pseudo value sample [v̂_s^min, v̂_s^max], then the density value f̂_V(v) at that point is set to zero by default. This is to stay true to best practice: an econometrician would never attempt nonparametric extrapolation, or inference at a point outside the sample. Therefore, in Figures 2 through 5, we evaluate the percentiles of the estimators at a given grid point conditional on the estimator being valid at that point. That is, in evaluating the estimator percentiles at v, we consider only simulations that span the grid point v. The percentiles of the density estimator all drop to zero when a given point is outside the respective Ω_s of all simulations.)

[Figure 2: Performance of the estimators for Example 1. Panels plot the estimated density against v: (A) Trim(h_b, h_v); (B) Trim(h_bG, h_v); (C) Boundary correction.]

[Figure 3: Performance of the estimators for Example 2. Panels: (A) Trim(h_b, h_v); (B) Trim(h_bG, h_v); (C) Boundary correction.]

[Figure 4: Performance of the estimators for Example 3. Panels: (A) Trim(h_b, h_v); (B) Trim(h_bG, h_v); (C) Boundary correction.]

[Figure 5: Performance of the estimators for Example 4. Panels: (A) Trim(h_b, h_v); (B) Trim(h_bG, h_v); (C) Boundary correction.]

Figures 3 and 4 also show how sample trimming may lead the researcher to infer a distinctive interior mode in the density when there is none; BCGPV, on the other hand, substantially addresses this problem. There are other salient features of Figures 2 through 5 worth noting. In general, two-stage nonparametric estimators (including BCGPV) have a harder time pinning down private value densities at the upper bound of the support, where bid functions display more curvature than at the lower end; see Figure 1(B). Moreover, in (8) the numerator is close to zero at the lower end, but close to one at the upper end, making the estimator more sensitive to sampling variability near the upper end of the support. (The 95th percentile lines for BCGPV all display blips near the upper end, though this phenomenon is less apparent in the medians or 5th percentile lines. The blips are due to a combination of curvature in the bid functions and a larger MISE for the KZ one-stage estimator near the boundaries. Recall from Section 2 that boundary correction delivers uniform consistency, but there is still larger variance and bias near the endpoints. These terms are of the same order on the entire support, but they may differ by a multiplicative constant depending on proximity to the endpoints.)

Table 1 highlights the importance of the effects discussed in Section 3.1.1. Columns three through five help measure the magnitude of the data-loss problem: the third reports the average number of observations that get trimmed, and the next two columns report the bias in the endpoint estimators (v̲̂, v̄̂), or the average difference between the true private-value endpoints and the extremal pseudo values that survive to the second stage; namely, (ξ̂(b_min + h_b), ξ̂(b_max − h_b)) under trimming and (ξ̃(b_min), ξ̃(b_max)) under boundary correction. While these columns depend only on h_b, column six, mean coverage mass, captures the compounding of data loss and boundary effects across both stages of estimation: it reports the mass under the true distribution over the region E[Ω_s], where estimation is not plagued by boundary effects or missing data. The final three columns contain a familiar measure of statistical performance, the root-MISE. Column seven reports first-stage

MISE over a common interval on which the three estimators are comparable.

[Table 1: Comparison of Nonparametric Estimators: Trimming vs. Boundary Correction. Rows: Examples 1 through 4, each under Trim(h_b, h_v), Trim(h_bG, h_v), and BC. Columns: observations trimmed, bias in v̲ and v̄, mean coverage mass, and root-MISE over [b̲ + h̄_b, b̄ − h̄_b], over Ω^(i), and over [v̲, v̄].]

We computed this as

MISE([b̲ + h̄_b, b̄ − h̄_b]) = E[ ∫_{b̲ + h̄_b}^{b̄ − h̄_b} ( f̂_{B,s}(b) − f_B(b) )² db ] = ∫_{b̲ + h̄_b}^{b̄ − h̄_b} E[ ( f̂_{B,s}(b) − f_B(b) )² ] db,

where h̄_b ≡ Σ_{s=1}^{S} h_{b,s}/S denotes the sample average of first-stage bandwidths according to Silverman's rule across all simulations. Silverman's criterion implicitly assumes no boundary effects, so for compact supports it ensures minimal MISE only on this interior set. The last two columns report MISE for the valuation density over two different intervals:

MISE(Ω^(i)) = E[ ∫_{Ω_s^(i)} ( f̂_{V,s}(v) − f_V(v) )² dv ]  and
MISE([v̲, v̄]) = E[ ∫_{v̲}^{v̄} ( f̂_{V,s}(v) − f_V(v) )² dv ] = ∫_{v̲}^{v̄} E[ ( f̂_{V,s}(v) − f_V(v) )² ] dv.

This last measure is the ultimate target of interest, but the previous one serves an illustrative purpose discussed below. (Recall that in computing MISE for columns seven and nine, density values were set to a default value of zero to avoid nonparametric extrapolation. Occasionally, a simulation produced an upper bound estimate strictly above v̄. In such cases, for the final MISE measure we maintained the upper limit of integration at v̄, thereby naturally penalizing this type of error on the part of the estimator.)

Column three shows how profound the data loss problem can be under sample trimming when one employs established bandwidth selection rules. Across all four examples, up to 43% of the data are trimmed on average. By reducing the first-stage bandwidth, one can mitigate the problem: our arbitrary choice of h_bG results in considerably less trimming. Trimmed data correspond to extreme bids, generating biases in the estimated endpoints of the support. Of course, with boundary correction there is no data loss, leading to substantial improvements in all examples. Mean coverage mass is a related measure, but it also picks up some of the influence of the second-stage bandwidth. It represents the mass under the true density between the vertical black lines in Figures 2 through 5, which is the average region where problems of boundary effects and missing data are absent.
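The root-MISE measures reported in the table can be approximated from the simulation output by averaging the squared estimation error over simulations and integrating over the relevant interval on a grid, as in the sketch below. The array shapes and the trapezoidal integration are our own illustrative choices.

```python
import numpy as np

def root_mise(estimates, truth, grid, lo, hi):
    """Approximate root-MISE of density estimates over [lo, hi].

    estimates : array of shape (S, len(grid)), one estimated density per simulation
    truth     : array of shape (len(grid),), true density evaluated on the same grid
    """
    mask = (grid >= lo) & (grid <= hi)
    sq_err = (estimates[:, mask] - truth[mask]) ** 2     # squared error, per simulation and grid point
    mise = np.trapz(sq_err.mean(axis=0), grid[mask])     # average over simulations, then integrate
    return np.sqrt(mise)
```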


More information

Boundary Correction Methods in Kernel Density Estimation Tom Alberts C o u(r)a n (t) Institute joint work with R.J. Karunamuni University of Alberta

Boundary Correction Methods in Kernel Density Estimation Tom Alberts C o u(r)a n (t) Institute joint work with R.J. Karunamuni University of Alberta Boundary Correction Methods in Kernel Density Estimation Tom Alberts C o u(r)a n (t) Institute joint work with R.J. Karunamuni University of Alberta November 29, 2007 Outline Overview of Kernel Density

More information

Columbia University. Department of Economics Discussion Paper Series. Caps on Political Lobbying: Reply. Yeon-Koo Che Ian Gale

Columbia University. Department of Economics Discussion Paper Series. Caps on Political Lobbying: Reply. Yeon-Koo Che Ian Gale Columbia University Department of Economics Discussion Paper Series Caps on Political Lobbying: Reply Yeon-Koo Che Ian Gale Discussion Paper No.: 0506-15 Department of Economics Columbia University New

More information

Working Paper EQUILIBRIUM SELECTION IN COMMON-VALUE SECOND-PRICE AUCTIONS. Heng Liu

Working Paper EQUILIBRIUM SELECTION IN COMMON-VALUE SECOND-PRICE AUCTIONS. Heng Liu Working Paper EQUILIBRIUM SELECTION IN COMMON-VALUE SECOND-PRICE AUCTIONS Heng Liu This paper considers the problem of equilibrium selection in a commonvalue second-price auction with two bidders. We show

More information

Statistics: Learning models from data

Statistics: Learning models from data DS-GA 1002 Lecture notes 5 October 19, 2015 Statistics: Learning models from data Learning models from data that are assumed to be generated probabilistically from a certain unknown distribution is a crucial

More information

ESTIMATING UNOBSERVED INDIVIDUAL HETEROGENEITY THROUGH PAIRWISE COMPARISONS

ESTIMATING UNOBSERVED INDIVIDUAL HETEROGENEITY THROUGH PAIRWISE COMPARISONS ESTIMATING UNOBSERVED INDIVIDUAL HETEROGENEITY THROUGH PAIRWISE COMPARISONS ELENA KRASNOKUTSKAYA, KYUNGCHUL SONG, AND XUN TANG Abstract. This paper proposes a new method to study environments with unobserved

More information

Time Series and Forecasting Lecture 4 NonLinear Time Series

Time Series and Forecasting Lecture 4 NonLinear Time Series Time Series and Forecasting Lecture 4 NonLinear Time Series Bruce E. Hansen Summer School in Economics and Econometrics University of Crete July 23-27, 2012 Bruce Hansen (University of Wisconsin) Foundations

More information

Nonparametric Identification of Risk Aversion in First-Price Auctions Under Exclusion Restrictions*

Nonparametric Identification of Risk Aversion in First-Price Auctions Under Exclusion Restrictions* Nonparametric Identification of Risk Aversion in First-Price Auctions Under Exclusion Restrictions* Emmanuel Guerre Queen Mary University of London Isabelle Perrigne Pennsylvania State University Quang

More information

Online Appendix for "Auctions in Markets: Common Outside Options and the Continuation Value Effect" Not intended for publication

Online Appendix for Auctions in Markets: Common Outside Options and the Continuation Value Effect Not intended for publication Online Appendix for "Auctions in Markets: Common Outside Options and the Continuation Value Effect" Not intended for publication Stephan Lauermann Gabor Virag March 19, 2012 1 First-price and second-price

More information

Symmetric Separating Equilibria in English Auctions 1

Symmetric Separating Equilibria in English Auctions 1 Games and Economic Behavior 38, 19 27 22 doi:116 game21879, available online at http: wwwidealibrarycom on Symmetric Separating Equilibria in English Auctions 1 Sushil Bihchandani 2 Anderson Graduate School

More information

INTRODUCTION TO PATTERN RECOGNITION

INTRODUCTION TO PATTERN RECOGNITION INTRODUCTION TO PATTERN RECOGNITION INSTRUCTOR: WEI DING 1 Pattern Recognition Automatic discovery of regularities in data through the use of computer algorithms With the use of these regularities to take

More information

Supplemental Online Appendix for Trading Across Borders in On-line Auctions

Supplemental Online Appendix for Trading Across Borders in On-line Auctions Supplemental Online Appendix for Trading Across Borders in On-line Auctions Elena Krasnokutskaya Christian Terwiesch Johns Hopkins University Wharton School of Business Lucia Tiererova Johns Hopkins University

More information

Matching with Contracts: The Critical Role of Irrelevance of Rejected Contracts

Matching with Contracts: The Critical Role of Irrelevance of Rejected Contracts Matching with Contracts: The Critical Role of Irrelevance of Rejected Contracts Orhan Aygün and Tayfun Sönmez May 2012 Abstract We show that an ambiguity in setting the primitives of the matching with

More information

Identification of Time and Risk Preferences in Buy Price Auctions

Identification of Time and Risk Preferences in Buy Price Auctions Identification of Time and Risk Preferences in Buy Price Auctions Daniel Ackerberg 1 Keisuke Hirano 2 Quazi Shahriar 3 1 University of Michigan 2 University of Arizona 3 San Diego State University April

More information

EconS Microeconomic Theory II Midterm Exam #2 - Answer Key

EconS Microeconomic Theory II Midterm Exam #2 - Answer Key EconS 50 - Microeconomic Theory II Midterm Exam # - Answer Key 1. Revenue comparison in two auction formats. Consider a sealed-bid auction with bidders. Every bidder i privately observes his valuation

More information

Flexible Estimation of Treatment Effect Parameters

Flexible Estimation of Treatment Effect Parameters Flexible Estimation of Treatment Effect Parameters Thomas MaCurdy a and Xiaohong Chen b and Han Hong c Introduction Many empirical studies of program evaluations are complicated by the presence of both

More information

(1) Sort all time observations from least to greatest, so that the j th and (j + 1) st observations are ordered by t j t j+1 for all j = 1,..., J.

(1) Sort all time observations from least to greatest, so that the j th and (j + 1) st observations are ordered by t j t j+1 for all j = 1,..., J. AFFIRMATIVE ACTION AND HUMAN CAPITAL INVESTMENT 8. ONLINE APPENDIX TO ACCOMPANY Affirmative Action and Human Capital Investment: Theory and Evidence from a Randomized Field Experiment, by CHRISTOPHER COTTON,

More information

Chapter 2: Resampling Maarten Jansen

Chapter 2: Resampling Maarten Jansen Chapter 2: Resampling Maarten Jansen Randomization tests Randomized experiment random assignment of sample subjects to groups Example: medical experiment with control group n 1 subjects for true medicine,

More information

Definitions and Proofs

Definitions and Proofs Giving Advice vs. Making Decisions: Transparency, Information, and Delegation Online Appendix A Definitions and Proofs A. The Informational Environment The set of states of nature is denoted by = [, ],

More information

Approximately Optimal Auctions

Approximately Optimal Auctions Approximately Optimal Auctions Professor Greenwald 17--1 We show that simple auctions can generate near-optimal revenue by using non-optimal reserve prices. 1 Simple vs. Optimal Auctions While Myerson

More information

September Math Course: First Order Derivative

September Math Course: First Order Derivative September Math Course: First Order Derivative Arina Nikandrova Functions Function y = f (x), where x is either be a scalar or a vector of several variables (x,..., x n ), can be thought of as a rule which

More information

Non-Existence of Equilibrium in Vickrey, Second-Price, and English Auctions

Non-Existence of Equilibrium in Vickrey, Second-Price, and English Auctions Non-Existence of Equilibrium in Vickrey, Second-Price, and English Auctions Matthew O. Jackson September 21, 2005 Forthcoming: Review of Economic Design Abstract A simple example shows that equilibria

More information

Nonparametric Identification and Estimation of a Transformation Model

Nonparametric Identification and Estimation of a Transformation Model Nonparametric and of a Transformation Model Hidehiko Ichimura and Sokbae Lee University of Tokyo and Seoul National University 15 February, 2012 Outline 1. The Model and Motivation 2. 3. Consistency 4.

More information

CS364B: Frontiers in Mechanism Design Lecture #3: The Crawford-Knoer Auction

CS364B: Frontiers in Mechanism Design Lecture #3: The Crawford-Knoer Auction CS364B: Frontiers in Mechanism Design Lecture #3: The Crawford-Knoer Auction Tim Roughgarden January 15, 2014 1 The Story So Far Our current theme is the design of ex post incentive compatible (EPIC) ascending

More information

Nonparametric Testing and Identification in Ascending Auctions with Unobserved Heterogeneity

Nonparametric Testing and Identification in Ascending Auctions with Unobserved Heterogeneity Nonparametric Testing and Identification in Ascending Auctions with Unobserved Heterogeneity Andrés Aradillas-ópez, Amit Gandhi, Daniel Quint June 3, 2010 Abstract When unobserved heterogeneity across

More information

A MECHANISM DESIGN APPROACH TO IDENTIFICATION AND ESTIMATION

A MECHANISM DESIGN APPROACH TO IDENTIFICATION AND ESTIMATION A MECHANISM DESIGN APPROACH TO IDENTIFICATION AND ESTIMATION BRADLEY LARSEN AND ANTHONY LEE ZHANG Abstract. This paper presents a two-step identification argument for a large class of quasilinear utility

More information

Lecture 4. 1 Examples of Mechanism Design Problems

Lecture 4. 1 Examples of Mechanism Design Problems CSCI699: Topics in Learning and Game Theory Lecture 4 Lecturer: Shaddin Dughmi Scribes: Haifeng Xu,Reem Alfayez 1 Examples of Mechanism Design Problems Example 1: Single Item Auctions. There is a single

More information

Quantile-Based Nonparametric Inference for First-Price Auctions

Quantile-Based Nonparametric Inference for First-Price Auctions Quantile-Based Nonparametric Inference for First-Price Auctions Vadim Marmer University of British Columbia Artyom Shneyerov y CIREQ, CIRANO, and Concordia University, Montreal August 30, 200 Abstract

More information

CS599: Algorithm Design in Strategic Settings Fall 2012 Lecture 11: Ironing and Approximate Mechanism Design in Single-Parameter Bayesian Settings

CS599: Algorithm Design in Strategic Settings Fall 2012 Lecture 11: Ironing and Approximate Mechanism Design in Single-Parameter Bayesian Settings CS599: Algorithm Design in Strategic Settings Fall 2012 Lecture 11: Ironing and Approximate Mechanism Design in Single-Parameter Bayesian Settings Instructor: Shaddin Dughmi Outline 1 Recap 2 Non-regular

More information

University of Warwick, Department of Economics Spring Final Exam. Answer TWO questions. All questions carry equal weight. Time allowed 2 hours.

University of Warwick, Department of Economics Spring Final Exam. Answer TWO questions. All questions carry equal weight. Time allowed 2 hours. University of Warwick, Department of Economics Spring 2012 EC941: Game Theory Prof. Francesco Squintani Final Exam Answer TWO questions. All questions carry equal weight. Time allowed 2 hours. 1. Consider

More information

Sealed-bid first-price auctions with an unknown number of bidders

Sealed-bid first-price auctions with an unknown number of bidders Sealed-bid first-price auctions with an unknown number of bidders Erik Ekström Department of Mathematics, Uppsala University Carl Lindberg The Second Swedish National Pension Fund e-mail: ekstrom@math.uu.se,

More information

Why is the field of statistics still an active one?

Why is the field of statistics still an active one? Why is the field of statistics still an active one? It s obvious that one needs statistics: to describe experimental data in a compact way, to compare datasets, to ask whether data are consistent with

More information

A note on multivariate Gauss-Hermite quadrature

A note on multivariate Gauss-Hermite quadrature A note on multivariate Gauss-Hermite quadrature Peter Jäckel 1th May 5 1 Introduction Gaussian quadratures are an ingenious way to approximate the integral of an unknown function f(x) over a specified

More information

A Novel Nonparametric Density Estimator

A Novel Nonparametric Density Estimator A Novel Nonparametric Density Estimator Z. I. Botev The University of Queensland Australia Abstract We present a novel nonparametric density estimator and a new data-driven bandwidth selection method with

More information

Bayesian Semiparametric GARCH Models

Bayesian Semiparametric GARCH Models Bayesian Semiparametric GARCH Models Xibin (Bill) Zhang and Maxwell L. King Department of Econometrics and Business Statistics Faculty of Business and Economics xibin.zhang@monash.edu Quantitative Methods

More information

Quantile-Based Nonparametric Inference for First-Price Auctions

Quantile-Based Nonparametric Inference for First-Price Auctions MPRA Munich Personal RePEc Archive Quantile-Based Nonparametric Inference for First-Price Auctions Vadim Marmer and Artyom Shneyerov University of British Columbia October 2006 Online at http://mpra.ub.uni-muenchen.de/5899/

More information

SEQUENTIAL ESTIMATION OF DYNAMIC DISCRETE GAMES: A COMMENT. MARTIN PESENDORFER London School of Economics, WC2A 2AE London, U.K.

SEQUENTIAL ESTIMATION OF DYNAMIC DISCRETE GAMES: A COMMENT. MARTIN PESENDORFER London School of Economics, WC2A 2AE London, U.K. http://www.econometricsociety.org/ Econometrica, Vol. 78, No. 2 (arch, 2010), 833 842 SEQUENTIAL ESTIATION OF DYNAIC DISCRETE GAES: A COENT ARTIN PESENDORFER London School of Economics, WC2A 2AE London,

More information

Bayesian Semiparametric GARCH Models

Bayesian Semiparametric GARCH Models Bayesian Semiparametric GARCH Models Xibin (Bill) Zhang and Maxwell L. King Department of Econometrics and Business Statistics Faculty of Business and Economics xibin.zhang@monash.edu Quantitative Methods

More information

Some forgotten equilibria of the Bertrand duopoly!?

Some forgotten equilibria of the Bertrand duopoly!? Some forgotten equilibria of the Bertrand duopoly!? Mathias Erlei Clausthal University of Technology Julius-Albert-Str. 2, 38678 Clausthal-Zellerfeld, Germany email: m.erlei@tu-clausthal.de Abstract This

More information

Chapter 2. Equilibrium. 2.1 Complete Information Games

Chapter 2. Equilibrium. 2.1 Complete Information Games Chapter 2 Equilibrium Equilibrium attempts to capture what happens in a game when players behave strategically. This is a central concept to these notes as in mechanism design we are optimizing over games

More information

Computational statistics

Computational statistics Computational statistics Markov Chain Monte Carlo methods Thierry Denœux March 2017 Thierry Denœux Computational statistics March 2017 1 / 71 Contents of this chapter When a target density f can be evaluated

More information

Checking Consistency. Chapter Introduction Support of a Consistent Family

Checking Consistency. Chapter Introduction Support of a Consistent Family Chapter 11 Checking Consistency 11.1 Introduction The conditions which define a consistent family of histories were stated in Ch. 10. The sample space must consist of a collection of mutually orthogonal

More information

Lecture Notes 1: Decisions and Data. In these notes, I describe some basic ideas in decision theory. theory is constructed from

Lecture Notes 1: Decisions and Data. In these notes, I describe some basic ideas in decision theory. theory is constructed from Topics in Data Analysis Steven N. Durlauf University of Wisconsin Lecture Notes : Decisions and Data In these notes, I describe some basic ideas in decision theory. theory is constructed from The Data:

More information

Interdependent Value Auctions with an Insider Bidder 1

Interdependent Value Auctions with an Insider Bidder 1 Interdependent Value Auctions with an Insider Bidder Jinwoo Kim We study the efficiency of standard auctions with interdependent values in which one of two bidders is perfectly informed of his value while

More information

Consistency and Asymptotic Normality for Equilibrium Models with Partially Observed Outcome Variables

Consistency and Asymptotic Normality for Equilibrium Models with Partially Observed Outcome Variables Consistency and Asymptotic Normality for Equilibrium Models with Partially Observed Outcome Variables Nathan H. Miller Georgetown University Matthew Osborne University of Toronto November 25, 2013 Abstract

More information

Reproducing Kernel Hilbert Spaces

Reproducing Kernel Hilbert Spaces 9.520: Statistical Learning Theory and Applications February 10th, 2010 Reproducing Kernel Hilbert Spaces Lecturer: Lorenzo Rosasco Scribe: Greg Durrett 1 Introduction In the previous two lectures, we

More information

Supplementary appendix to the paper Hierarchical cheap talk Not for publication

Supplementary appendix to the paper Hierarchical cheap talk Not for publication Supplementary appendix to the paper Hierarchical cheap talk Not for publication Attila Ambrus, Eduardo M. Azevedo, and Yuichiro Kamada December 3, 011 1 Monotonicity of the set of pure-strategy equilibria

More information

Lecture 6 Games with Incomplete Information. November 14, 2008

Lecture 6 Games with Incomplete Information. November 14, 2008 Lecture 6 Games with Incomplete Information November 14, 2008 Bayesian Games : Osborne, ch 9 Battle of the sexes with incomplete information Player 1 would like to match player 2's action Player 1 is unsure

More information

Principles Underlying Evaluation Estimators

Principles Underlying Evaluation Estimators The Principles Underlying Evaluation Estimators James J. University of Chicago Econ 350, Winter 2019 The Basic Principles Underlying the Identification of the Main Econometric Evaluation Estimators Two

More information

6 Pattern Mixture Models

6 Pattern Mixture Models 6 Pattern Mixture Models A common theme underlying the methods we have discussed so far is that interest focuses on making inference on parameters in a parametric or semiparametric model for the full data

More information

Using Economic Theory to Guide Numerical Analysis: Solving for Equilibria in Models of Asymmetric First-Price Auctions

Using Economic Theory to Guide Numerical Analysis: Solving for Equilibria in Models of Asymmetric First-Price Auctions Using Economic Theory to Guide Numerical Analysis: Solving for Equilibria in Models of Asymmetric First-Price Auctions Timothy P. Hubbard Department of Economics Texas Tech University Timothy.Hubbard@ttu.edu

More information

Rent-seeking with Non-Identical Sharing Rules: An Equilibrium Rescued

Rent-seeking with Non-Identical Sharing Rules: An Equilibrium Rescued Rent-seeking with Non-Identical Sharing Rules: An Equilibrium Rescued by Douglas D. Davis and Robert J. Reilly * September 1997 Abstract Nitzan s (1991) analysis of differential sharing rules in a collective

More information

THEORIES ON AUCTIONS WITH PARTICIPATION COSTS. A Dissertation XIAOYONG CAO

THEORIES ON AUCTIONS WITH PARTICIPATION COSTS. A Dissertation XIAOYONG CAO THEORIES ON AUCTIONS WITH PARTICIPATION COSTS A Dissertation by XIAOYONG CAO Submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree

More information

2. The Concept of Convergence: Ultrafilters and Nets

2. The Concept of Convergence: Ultrafilters and Nets 2. The Concept of Convergence: Ultrafilters and Nets NOTE: AS OF 2008, SOME OF THIS STUFF IS A BIT OUT- DATED AND HAS A FEW TYPOS. I WILL REVISE THIS MATE- RIAL SOMETIME. In this lecture we discuss two

More information

Isomorphisms between pattern classes

Isomorphisms between pattern classes Journal of Combinatorics olume 0, Number 0, 1 8, 0000 Isomorphisms between pattern classes M. H. Albert, M. D. Atkinson and Anders Claesson Isomorphisms φ : A B between pattern classes are considered.

More information

UNIVERSITY OF NOTTINGHAM. Discussion Papers in Economics CONSISTENT FIRM CHOICE AND THE THEORY OF SUPPLY

UNIVERSITY OF NOTTINGHAM. Discussion Papers in Economics CONSISTENT FIRM CHOICE AND THE THEORY OF SUPPLY UNIVERSITY OF NOTTINGHAM Discussion Papers in Economics Discussion Paper No. 0/06 CONSISTENT FIRM CHOICE AND THE THEORY OF SUPPLY by Indraneel Dasgupta July 00 DP 0/06 ISSN 1360-438 UNIVERSITY OF NOTTINGHAM

More information

EFFORT INCENTIVES, ACHIEVEMENT GAPS AND AFFIRMA. AND AFFIRMATIVE ACTION POLICIES (very preliminary)

EFFORT INCENTIVES, ACHIEVEMENT GAPS AND AFFIRMA. AND AFFIRMATIVE ACTION POLICIES (very preliminary) EFFORT INCENTIVES, ACHIEVEMENT GAPS AND AFFIRMATIVE ACTION POLICIES (very preliminary) Brent Hickman Department of Economics, University of Iowa December 22, 2008 Brief Model Outline In this paper, I use

More information

THE CHOICE BETWEEN MULTIPLICATIVE AND ADDITIVE OUTPUT UNCERTAINTY

THE CHOICE BETWEEN MULTIPLICATIVE AND ADDITIVE OUTPUT UNCERTAINTY THE CHOICE BETWEEN MULTIPLICATIVE AND ADDITIVE OUTPUT UNCERTAINTY Moawia Alghalith University of St. Andrews and Ardeshir J. Dalal Northern Illinois University Abstract When modeling output uncertainty,

More information

Advanced Economic Theory Lecture 9. Bilateral Asymmetric Information. Double Auction (Chatterjee and Samuelson, 1983).

Advanced Economic Theory Lecture 9. Bilateral Asymmetric Information. Double Auction (Chatterjee and Samuelson, 1983). Leonardo Felli 6 December, 2002 Advanced Economic Theory Lecture 9 Bilateral Asymmetric Information Double Auction (Chatterjee and Samuelson, 1983). Two players, a buyer and a seller: N = {b, s}. The seller

More information

Averaging Estimators for Regressions with a Possible Structural Break

Averaging Estimators for Regressions with a Possible Structural Break Averaging Estimators for Regressions with a Possible Structural Break Bruce E. Hansen University of Wisconsin y www.ssc.wisc.edu/~bhansen September 2007 Preliminary Abstract This paper investigates selection

More information

A Robust Approach to Estimating Production Functions: Replication of the ACF procedure

A Robust Approach to Estimating Production Functions: Replication of the ACF procedure A Robust Approach to Estimating Production Functions: Replication of the ACF procedure Kyoo il Kim Michigan State University Yao Luo University of Toronto Yingjun Su IESR, Jinan University August 2018

More information

A Course in Applied Econometrics. Lecture 10. Partial Identification. Outline. 1. Introduction. 2. Example I: Missing Data

A Course in Applied Econometrics. Lecture 10. Partial Identification. Outline. 1. Introduction. 2. Example I: Missing Data Outline A Course in Applied Econometrics Lecture 10 1. Introduction 2. Example I: Missing Data Partial Identification 3. Example II: Returns to Schooling 4. Example III: Initial Conditions Problems in

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 7 Approximate

More information

Interdependent Value Auctions with Insider Bidders

Interdependent Value Auctions with Insider Bidders Interdependent Value Auctions with Insider Bidders Jinwoo Kim November 2003 Abstract I study auctions in which there exists an asymmetry in bidders knowledge about their interdependent valuations. Bidders

More information

Nonparametric Density Estimation

Nonparametric Density Estimation Nonparametric Density Estimation Econ 690 Purdue University Justin L. Tobias (Purdue) Nonparametric Density Estimation 1 / 29 Density Estimation Suppose that you had some data, say on wages, and you wanted

More information

Microeconomics II Lecture 4: Incomplete Information Karl Wärneryd Stockholm School of Economics November 2016

Microeconomics II Lecture 4: Incomplete Information Karl Wärneryd Stockholm School of Economics November 2016 Microeconomics II Lecture 4: Incomplete Information Karl Wärneryd Stockholm School of Economics November 2016 1 Modelling incomplete information So far, we have studied games in which information was complete,

More information

Equilibria in Second Price Auctions with Participation Costs

Equilibria in Second Price Auctions with Participation Costs Equilibria in Second Price Auctions with Participation Costs Guofu Tan and Okan Yilankaya January 2005 Abstract We investigate equilibria of sealed-bid second price auctions with bidder participation costs

More information

Microeconomic Theory (501b) Problem Set 10. Auctions and Moral Hazard Suggested Solution: Tibor Heumann

Microeconomic Theory (501b) Problem Set 10. Auctions and Moral Hazard Suggested Solution: Tibor Heumann Dirk Bergemann Department of Economics Yale University Microeconomic Theory (50b) Problem Set 0. Auctions and Moral Hazard Suggested Solution: Tibor Heumann 4/5/4 This problem set is due on Tuesday, 4//4..

More information

Issues on quantile autoregression

Issues on quantile autoregression Issues on quantile autoregression Jianqing Fan and Yingying Fan We congratulate Koenker and Xiao on their interesting and important contribution to the quantile autoregression (QAR). The paper provides

More information

Nonparametric Identification of a Binary Random Factor in Cross Section Data - Supplemental Appendix

Nonparametric Identification of a Binary Random Factor in Cross Section Data - Supplemental Appendix Nonparametric Identification of a Binary Random Factor in Cross Section Data - Supplemental Appendix Yingying Dong and Arthur Lewbel California State University Fullerton and Boston College July 2010 Abstract

More information

Mechanism Design: Basic Concepts

Mechanism Design: Basic Concepts Advanced Microeconomic Theory: Economics 521b Spring 2011 Juuso Välimäki Mechanism Design: Basic Concepts The setup is similar to that of a Bayesian game. The ingredients are: 1. Set of players, i {1,

More information

Payment Rules for Combinatorial Auctions via Structural Support Vector Machines

Payment Rules for Combinatorial Auctions via Structural Support Vector Machines Payment Rules for Combinatorial Auctions via Structural Support Vector Machines Felix Fischer Harvard University joint work with Paul Dütting (EPFL), Petch Jirapinyo (Harvard), John Lai (Harvard), Ben

More information

Strategies under Strategic Uncertainty

Strategies under Strategic Uncertainty Discussion Paper No. 18-055 Strategies under Strategic Uncertainty Helene Mass Discussion Paper No. 18-055 Strategies under Strategic Uncertainty Helene Mass Download this ZEW Discussion Paper from our

More information

Some Background Material

Some Background Material Chapter 1 Some Background Material In the first chapter, we present a quick review of elementary - but important - material as a way of dipping our toes in the water. This chapter also introduces important

More information

Checking up on the neighbors: Quantifying uncertainty in relative event location

Checking up on the neighbors: Quantifying uncertainty in relative event location Checking up on the neighbors: Quantifying uncertainty in relative event location The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation

More information

Lecture 10: Mechanism Design

Lecture 10: Mechanism Design Computational Game Theory Spring Semester, 2009/10 Lecture 10: Mechanism Design Lecturer: Yishay Mansour Scribe: Vera Vsevolozhsky, Nadav Wexler 10.1 Mechanisms with money 10.1.1 Introduction As we have

More information