International Side-payments to Improve Global Public Good Provision when Transfers are Refinanced through a Tax on Local and Global Externalities
Martin Altemeyer-Bartscher, A. Markandya, Dirk T. G. Rübbelke
International Economic Journal,
No. 1,
2014
Abstract
This paper discusses a tax-transfer scheme that aims to address the under-provision problem associated with the private supply of international public goods and to bring about internationally Pareto-optimal allocations. In particular, we consider the example of the global public good ‘climate stabilization’ in both an analytical model and a numerical simulation model. The proposed scheme levies Pigouvian taxes globally, while international side-payments are employed to give individual countries an incentive not to free-ride on the international Pigouvian tax scheme. The side-payments, in turn, are financed via environmental taxes. As a distinctive feature, we take into account ancillary benefits that may be associated with the local public characteristics of climate policy. We determine the positive impact that ancillary effects may exert on the scope for financing side-payments via environmental taxation. A particularly attractive feature of ancillary benefits is that they arise shortly after the implementation of climate policies and therefore yield an almost immediate payback on investments in abatement efforts. Especially in times of high public debt, long amortization periods would tend to reduce political support for investments in climate policy.
Note on the Hidden Risk of Inflation
Makram El-Shagi, Sebastian Giesen
Journal of Economic Policy Reform,
No. 1,
2014
Abstract
The continued expansionary policy of the Federal Reserve gives rise to speculation about whether the Fed will be able to maintain price stability in the coming decades. Most of the scientific work relating money to prices relies on broad monetary aggregates (i.e. M2 for the United States). In our paper, we argue that this view falls short: the historically unique monetary expansion has not yet fully reached M2. Using a cointegration approach, we show the hidden risks for the future development of M2 and, correspondingly, prices. In a simulation analysis we show that even if the money multiplier remains substantially below its pre-crisis level, M2 will exceed its current growth path with a probability of 95%.
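The logic of such a simulation exercise can be illustrated with a stylized Monte Carlo sketch. All numbers below are illustrative assumptions, not the paper's calibration: base money grows rapidly, the money multiplier recovers only partway toward its pre-crisis level, and we estimate the probability that simulated M2 ends above its pre-crisis trend path.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 40            # quarters simulated
n_sims = 10_000   # Monte Carlo replications

# Illustrative assumptions (not the paper's calibration):
base0 = 1.0                   # initial monetary base (normalized)
base_growth = 0.03            # quarterly base money growth
mult_pre = 8.0                # pre-crisis money multiplier
mult_now = 4.0                # current (depressed) multiplier
trend_growth = 0.015          # pre-crisis M2 trend growth per quarter

m2_trend = mult_pre * base0 * (1 + trend_growth) ** T

exceed = 0
for _ in range(n_sims):
    mult, base = mult_now, base0
    for t in range(T):
        base *= 1 + base_growth
        # multiplier mean-reverts only partway to its pre-crisis level
        target = mult_now + 0.6 * (mult_pre - mult_now)
        mult += 0.1 * (target - mult) + rng.normal(0, 0.05)
    exceed += (mult * base) > m2_trend

rate = exceed / n_sims
print(f"P(M2 above trend after {T} quarters) ≈ {rate:.2f}")
```

Under these assumed parameters, rapid base growth dominates the incomplete multiplier recovery, so nearly all simulated paths end above the trend.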
Exploring the Evolution of Innovation Networks in Science-driven and Scale-intensive Industries: New Evidence from a Stochastic Actor-based Approach
T. Buchmann, D. Hain, Muhamed Kudic, M. Müller
IWH Discussion Papers,
No. 1,
2014
Abstract
Our primary goal is to analyse the drivers of evolutionary network change processes by using a stochastic actor-based simulation approach. We contribute to the literature by combining two unique datasets, covering the German laser and automotive industries between 2002 and 2006, to explore whether geographical, network-related, and technological determinants affect the evolution of networks, and if so, to what extent these determinants systematically differ between science-driven and scale-intensive industries. Our results provide empirical evidence for the explanatory power of network-related determinants in both industries. The ‘experience effect’ as well as the ‘transitivity effect’ are significant in both industries but more pronounced for laser manufacturing firms. When it comes to ‘geographical effects’ and ‘technological effects’, the picture changes considerably. While geographical proximity plays an important role in the automotive industry, firms in the laser industry seem to be less dependent on geographical closeness to cooperation partners; instead, they rather search for cooperation opportunities at a distance. This might reflect the strong need of firms in science-driven industries to access diverse external knowledge, which cannot necessarily be found in their immediate geographical surroundings. Technological proximity negatively influences cooperation decisions for laser source manufacturers, yet has no impact for automotive firms. In other words, technological heterogeneity seems to explain, at least in science-driven industries, the attractiveness of potential cooperation partners.
Effects of Incorrect Specification on the Finite Sample Properties of Full and Limited Information Estimators in DSGE Models
Sebastian Giesen, Rolf Scheufele
Abstract
In this paper we analyze the small sample properties of full information and limited information estimators in a potentially misspecified DSGE model. To this end, we conduct a simulation study based on a standard New Keynesian model including price and wage rigidities, and study the effects of omitted variable problems on the structural parameter estimates of the model. We find that FIML is superior when the model is correctly specified. In cases where some of the model characteristics are omitted, the performance of FIML is highly unreliable, whereas GMM estimates remain approximately unbiased and significance tests are mostly reliable.
Money and Inflation: Consequences of the Recent Monetary Policy
Makram El-Shagi, Sebastian Giesen
Journal of Policy Modeling,
No. 4,
2013
Abstract
We use a multivariate state space framework to analyze the short-run impact of money on prices in the United States. The key contribution of this approach is that it allows us to identify the impact of money growth on inflation without having to model money demand explicitly.
Using our results, which provide evidence for a substantial impact of money on prices in the US, we analyze the consequences of the Fed's response to the financial crisis. Our results indicate a rise of US inflation to above 5% for more than a decade. Alternative exit strategies that we simulate cannot fully compensate for the monetary pressure without risking serious repercussions on the real economy. Further simulations of a double dip in the United States indicate that a repetition of the unusually expansive monetary policy, in addition to increasing inflation, might cause growth losses exceeding the contemporaneous easing of the crisis.
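The state space idea, inferring an unobserved component from noisy observations, can be sketched with a minimal univariate local-level model and Kalman filter. This is a generic illustration, not the paper's multivariate specification; all parameters here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a local-level model: y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t
T = 200
sigma_eps, sigma_eta = 1.0, 0.3
mu = np.cumsum(rng.normal(0, sigma_eta, T))   # latent level (random walk)
y = mu + rng.normal(0, sigma_eps, T)          # noisy observations

# Kalman filter for the local-level model
a, p = 0.0, 1e6            # near-diffuse initialization of state mean/variance
filtered = np.empty(T)
for t in range(T):
    p_pred = p + sigma_eta**2                 # prediction: state is a random walk
    k = p_pred / (p_pred + sigma_eps**2)      # Kalman gain
    a = a + k * (y[t] - a)                    # update with observation t
    p = (1 - k) * p_pred
    filtered[t] = a

# The filtered level tracks the latent level better than the raw observations
rmse_filtered = np.sqrt(np.mean((filtered - mu) ** 2))
rmse_raw = np.sqrt(np.mean((y - mu) ** 2))
print(rmse_filtered, rmse_raw)
```

The same filtering recursion, extended to a vector state, is what makes it possible to extract an unobserved inflation-relevant money component without specifying a money demand function.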
Financial Factors in Macroeconometric Models
Sebastian Giesen
Volkswirtschaft, Ökonomie, Shaker Verlag GmbH, Aachen,
2013
Abstract
The role of credit has long been identified as a key factor for economic development (see e.g. Wicksell (1898), Keynes (1931), Fisher (1933) and Minsky (1957, 1964)). Even before the financial crisis most researchers and policy makers agreed that financial frictions play an important role for business cycles and that financial turmoil can result in severe economic downturns (see e.g. Mishkin (1978), Bernanke (1981, 1983), Diamond (1984), Calomiris (1993) and Bernanke and Gertler (1995)). However, in practice researchers and policy makers mostly used simplified models for forecasting and simulation purposes. They often neglected the impact of financial frictions and emphasized other non-financial market frictions when analyzing business cycle fluctuations (prominent exceptions include Kiyotaki and Moore (1997), Bernanke, Gertler, and Gilchrist (1999) and Christiano, Motto, and Rostagno (2010)). This was largely because most economic downturns did not seem to be closely related to financial market failures (see Eichenbaum (2011)). The outbreak of the subprime crisis, which caused panic in financial markets and led to the default of Lehman Brothers in September 2008, then led to a reconsideration of such macroeconomic frameworks (see Caballero (2010) and Trichet (2011)). To address the economic debate from a new perspective, it is therefore necessary to integrate the relevant frictions that help to explain what we have experienced during recent years.
In this thesis, I analyze different ways to incorporate relevant frictions and financial variables in macroeconometric models. I discuss the potential consequences for standard statistical inference and macroeconomic policy. I cover three different aspects in this work. Each aspect presents an idea in a self-contained unit. The following paragraphs present more detail on the main topics covered.
Testing for Structural Breaks at Unknown Time: A Steeplechase
Makram El-Shagi, Sebastian Giesen
Computational Economics,
No. 1,
2013
Abstract
This paper analyzes the role of common data problems when identifying structural breaks in small samples. Most notably, we survey small sample properties of the most commonly applied endogenous break tests developed by Brown et al. (J R Stat Soc B 37:149–163, 1975) and Zeileis (Stat Pap 45(1):123–131, 2004), Nyblom (J Am Stat Assoc 84(405):223–230, 1989) and Hansen (J Policy Model 14(4):517–533, 1992), and Andrews et al. (J Econ 70(1):9–38, 1996). Power and size properties are derived using Monte Carlo simulations. We find that the Nyblom test is on par with the commonly used F-type tests in small samples in terms of power. While the Nyblom test’s power decreases if the structural break occurs close to the margin of the sample, it proves far more robust to non-normal distributions of the error term, which matter strongly in small samples although being asymptotically irrelevant for all tests analyzed in this paper.
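The mechanics of an endogenous break test can be sketched with a generic sup-F scan for a mean shift at an unknown date. This is a simplified illustration of the F-type family, not a reimplementation of the specific tests surveyed; the sample size, break date, and break magnitude are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def sup_f_break(y, trim=0.15):
    """Scan candidate break dates for a mean shift and return
    (max F statistic, estimated break index). Generic sup-F sketch."""
    n = len(y)
    lo, hi = int(trim * n), int((1 - trim) * n)
    ssr0 = np.sum((y - y.mean()) ** 2)         # restricted SSR (no break)
    best_f, best_tau = -np.inf, lo
    for tau in range(lo, hi):
        # unrestricted SSR: separate means before and after candidate break
        ssr1 = np.sum((y[:tau] - y[:tau].mean()) ** 2) \
             + np.sum((y[tau:] - y[tau:].mean()) ** 2)
        f = (ssr0 - ssr1) / (ssr1 / (n - 2))
        if f > best_f:
            best_f, best_tau = f, tau
    return best_f, best_tau

# Small sample with a clear mean shift of 2 standard deviations at t = 60
n, true_break = 100, 60
y = rng.normal(0, 1, n)
y[true_break:] += 2.0

stat, tau_hat = sup_f_break(y)
print(stat, tau_hat)
```

Because the break date is searched over, the statistic no longer follows a standard F distribution under the null; size and power therefore have to be assessed by Monte Carlo, which is exactly the exercise the paper performs for the surveyed tests.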
Are Qualitative Inflation Expectations Useful to Predict Inflation?
Rolf Scheufele
Journal of Business Cycle Measurement and Analysis,
No. 1,
2011
Abstract
This paper examines the properties of qualitative inflation expectations collected from economic experts for Germany. It describes their characteristics with respect to rationality and Granger causality. An out-of-sample simulation study investigates whether this indicator is suitable for inflation forecasting. Results from other standard forecasting models are considered and compared with models employing survey measures. We find that a model using survey expectations outperforms most of the competing models. Moreover, we find some evidence that the survey indicator already contains information from other model types (e.g. Phillips curve models). However, forecast quality may be further improved by additionally taking into account information from some financial indicators.
Distance Functions for Matching in Small Samples
Eva Dettmann, Christian Schmeißer, Claudia Becker
Computational Statistics & Data Analysis,
No. 5,
2011
Abstract
The development of ‘standards’ for the application of matching algorithms in empirical evaluation studies is still an outstanding goal. The first step of the matching procedure is the choice of an appropriate distance function. In empirical evaluation situations, sample sizes are often small. Moreover, the data consist of variables with different scale levels, which have to be considered explicitly in the matching process. A simulation is performed that addresses these empirical challenges and supplements former studies in this respect. The choice of the analysed distance functions is guided by the results of former theoretical studies and recommendations in the empirical literature. Thus, two balancing scores (the propensity score and the index score) and the Mahalanobis distance are considered. Additionally, aggregated statistical distance functions not yet used for empirical evaluation are included. The matching outcomes are compared using non-parametric scale-specific tests for identical distributions of the characteristics in the treatment and control groups. The simulation results show that, in small samples, aggregated statistical distance functions are the better choice for summarising similarities in differently scaled variables, compared to the commonly used measures.
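One of the benchmark distance functions compared in the study, the Mahalanobis distance, can be sketched in a minimal nearest-neighbour matching setup. The data, sample sizes, and matching-with-replacement rule below are illustrative assumptions, not the study's design.

```python
import numpy as np

rng = np.random.default_rng(7)

# Small-sample setup: 10 treated and 25 control units, 3 covariates
X_treat = rng.normal(0.5, 1, size=(10, 3))
X_ctrl = rng.normal(0.0, 1, size=(25, 3))

# Mahalanobis distance scales covariates by their pooled covariance
X_all = np.vstack([X_treat, X_ctrl])
VI = np.linalg.inv(np.cov(X_all, rowvar=False))    # inverse covariance matrix

def mahalanobis_match(X_treat, X_ctrl, VI):
    """Nearest-neighbour match (with replacement) for each treated unit."""
    matches = []
    for x in X_treat:
        d = X_ctrl - x
        dist = np.einsum("ij,jk,ik->i", d, VI, d)  # squared Mahalanobis dist.
        matches.append(int(np.argmin(dist)))
    return matches

matches = mahalanobis_match(X_treat, X_ctrl, VI)
print(matches)
```

Unlike a balancing score, which collapses the covariates into a scalar before matching, the Mahalanobis distance compares units in the full covariate space; handling mixed scale levels is what motivates the aggregated statistical distance functions the study favours.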
Has the Euro Increased International Price Elasticities?
Oliver Holtemöller, Götz Zeddies
IWH Discussion Papers,
No. 18,
2010
published in: Empirica