
Glossary



S-Plus

Statistical software published by Mathsoft.

Source: econterms

s.t.

An abbreviation meaning "subject to" or "such that", where constraints follow.
In a common usage:

max_x f(x) s.t. g(x)=0

The above expression, in words, means: "The value of f(x) that is greatest among all those for which the argument x satisfies the constraint that g(x)=0." (Here f() and g() are fixed, possibly known, real-valued functions of x.)

Source: econterms

saddle point

In a second-order [linear difference equation] system, ... if one root has absolute value greater than one, and the other root has absolute value less than one, then the steady state of the system is called a saddle point. In this case, the system is unstable for almost all initial conditions. The exception is the set of initial conditions that begin on the eigenvector associated with the stable eigenvalue.

Source: econterms

Sargan test

A test of the validity of instrumental variables. It is a test of the overidentifying restrictions. The hypothesis being tested is that the instrumental variables are uncorrelated with some set of residuals; if so, they are acceptable, healthy instruments.

If the null hypothesis is confirmed statistically (that is, not rejected), the instruments pass the test; they are valid by this criterion.

In the Shi and Svensson working paper (which shows that elected national governments in 1975-1995 had larger fiscal deficits in election years, especially in developing countries), the Sargan statistic was asymptotically distributed chi-squared if the null hypothesis were true.

See test of identifying restrictions, which is not exactly the same thing, I think.

Source: econterms

SAS

Statistical analysis software. SAS web site

Source: econterms

Saving

As an accounting concept, saving can be defined as the residual that is left from income after the consumption choice has been made as part of the household's utility maximization process. Substantially, saving is future consumption, and it is an important example of an intertemporal decision. The division of income between consumption and saving is driven by preferences between present and future consumption (or the utility derived from consumption).

The main determinants of the consumption-saving trade-off are the interest rate and the individual's rate of time preference, reflecting the intertemporal substitution from one period to a future period: income that is not used for consumption purposes can be saved and consumed one period later, earning an interest payment and hence allowing for more consumption in the future. This increase in the absolute amount available for consumption, as reflected in the interest rate, then has to be compared with the individual's rate of time preference (the latter expressing her patience with respect to later consumption or, more generally, to delayed utility derived from consumption). In the optimum, the interest rate and the rate of time preference have to be equal. This is one of the fundamentals of intertemporal choice (as a special form of rational behavior).

This intertemporal trade-off is the central building block of the life-cycle model of saving. Note that this model is firmly grounded in expected utility theory and assumes rational behavior. In recent years, there is much research on psychological aspects of savings. Wärneryd (1999) contains a good introduction to that literature.

Source: SFB 504

scale economies

Same as economies of scale.

Source: econterms

scatter diagram

A graph of unconnected points of data. If there are many of them the result may be 'clouds' of data which are hard to interpret; in such a case one might want to use a nonparametric technique to estimate a regression function.

Source: econterms

scedastic function

Given an independent variable x and a dependent variable y, the scedastic function is the conditional variance of y given x. That variance of the conditional distribution is:
var[y|x] = E[(y - E[y|x])^2 | x]

= the integral (or sum) of (y - E[y|x])^2 f(y|x) dy
= E[y^2|x] - (E[y|x])^2.

Source: econterms

SCF

Stands for Survey of Consumer Finances.

Source: econterms

Schema

A schema is an abstract or generic knowledge structure, stored in memory, that specifies the defining features and relevant attributes of some stimulus domain, and the interrelations among those attributes. Schemata are often organized in a hierarchical order and they can consist of sub-schemata. These organized knowledge structures are highly important for information processing as they guide interpretation, help control attention, and affect memory encoding and retrieval. As an example, the typical schema of "skinheads" might consist of attributes such as "bald", "aggressive" and "young".

Source: SFB 504

Schumpeterian growth

Paraphrasing from Mokyr (1990): Schumpeterian growth is economic growth brought about by increases in knowledge, most of which is called technological progress.

Source: econterms

Schwarz Criterion

A criterion for selecting among formal econometric models. The Schwarz Criterion is a number:
T ln (RSS) + K ln(T)
The criterion is minimized over choices of K to form a tradeoff between the fit of the model (which lowers the sum of squared residuals) and the model's complexity, which is measured by K. Thus an AR(K) model versus an AR(K+1) can be compared by this criterion for a given batch of data.
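As a numerical sketch (the RSS figures and sample size are invented for illustration), the criterion can be computed and compared across candidate models like this:

```python
import math

def schwarz_criterion(rss: float, T: int, K: int) -> float:
    """The criterion as stated above: T*ln(RSS) + K*ln(T)."""
    return T * math.log(rss) + K * math.log(T)

# Comparing a hypothetical AR(2) fit against an AR(3) fit of the same
# series: the extra lag must lower RSS enough to offset the K*ln(T)
# penalty, otherwise the smaller model is preferred.
sc_ar2 = schwarz_criterion(rss=104.0, T=200, K=2)
sc_ar3 = schwarz_criterion(rss=103.5, T=200, K=3)
preferred = "AR(2)" if sc_ar2 < sc_ar3 else "AR(3)"
```

Here the small drop in RSS does not justify the extra parameter, so the criterion picks the smaller model.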

Source: econterms

Scitovsky paradox

The problem that some ways of aggregating social welfare may make it possible that a switch from allocation A to allocation B seems like an improvement in social welfare, but so does a move back. (An example may be Condorcet's voting paradox.)

Scitovsky, T., 1941, 'A Note on Welfare Propositions in Economics', Review of Economic Studies, Vol 9, Nov 1941, pp 77-88.

The Scitovsky criterion (for a social welfare function?) is that the Scitovsky paradox not exist.

Source: econterms

score

In maximum likelihood estimation, the score vector is the gradient of the log-likelihood function with respect to the parameters. So it has the same number of elements as the parameter vector does (often denoted k). The score is a random variable; it is a function of the data. It has expectation zero, and is set to zero exactly for a given sample in the maximum likelihood estimation process.

Denoting the score as S(q), and the log-likelihood function as ln L(q), where in both cases the data are also implied arguments:

S(q) = d ln L(q) / dq

Example: In OLS regression of Yt=Xtb+et, the score for each possible parameter value, b, is Xt'et(b).

The variance of the score is E[score^2] - (E[score])^2, which equals E[score^2] since E[score] is zero. E[score^2] is also called the information matrix and is denoted I(q).
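A small simulation sketch of the OLS example above (the data are generated, not from the entry): at the OLS estimate, the score contributions Xt'et(b) sum exactly to zero.

```python
import random

random.seed(0)
n = 500
x = [random.gauss(0, 1) for _ in range(n)]
y = [2.0 * xi + random.gauss(0, 1) for xi in x]   # true slope b = 2

# OLS (= ML under normal errors) slope estimate
b_hat = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# score at b: sum over t of x_t * e_t(b); it is zero exactly at b_hat
score_at_bhat = sum(xi * (yi - b_hat * xi) for xi, yi in zip(x, y))
```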

Source: econterms

screening game

A game in which an uninformed player offers a menu of choices to the player with private information (the informed player). The selection of the elements of that menu (which might be, for example, employment contracts containing pairs of pay rates and working hours) is a choice for the uninformed player to optimize on the basis of expectations about the possible types of the informed player.

Source: econterms

Script

A script is a knowledge structure that describes the expected sequence of events in familiar situations, for instance the script of a restaurant situation. It includes information about the invariant aspects of the situation (for example, all restaurants serve food). Moreover, it has slots for variables that apply to specific restaurants, for example, how expensive a particular restaurant is. Scripts combine single scenes into an integrated sequence from the point of view of a specific actor. Besides this temporal organization, scripts, like any other schema, are organized in a hierarchical order.

Source: SFB 504

second moment

The second moment of a random variable is the expected value of the square of the draw of the random variable. That is, the second moment is E[X^2]. Same as the 'uncentered second moment', as distinguished from the variance, which is the 'centered second moment.'

Source: econterms

Second price sealed bid auction

Simultaneous bidding game in which the bidder who has submitted the highest bid is awarded the object, and he pays only the highest competing bid (which is the 'second highest' bid). In second price auctions with statistically independent private valuations, each bidder has a dominant strategy: to bid exactly his valuation. The second price auction is also called the Vickrey auction; its multi-object form is the uniform price auction.

Source: SFB 504

Second Welfare Theorem

A Pareto efficient allocation can be achieved by a Walrasian equilibrium if every agent has a positive quantity of every good, and preferences are convex, continuous, and strictly increasing.
(My best understanding of 'convex preferences' is that it means 'concave utility function'.)

Source: econterms

secular

an adjective meaning "long term" as in the phrase "secular trends." Outside the research context its more common meaning is 'not religious'.

Source: econterms

See also

decision strategies (in the social sciences), dominant strategies

Source: SFB 504

seigniorage

Alternate spelling for seignorage.

Source: econterms

seignorage

"The amount of real purchasing power that [a] government can extract from the public by printing money.'" -- Cukierman 1992
Explanation: When a government prints money, it is in essence borrowing interest-free since it receives goods in exchange for the money, and must accept the money in return only at some future time. It gains further if issuing new money reduces (through inflation) the value of old money by reducing the liability that the old money represents. These gains to a money-issuing government are called "seignorage" revenues.

The original meaning of seignorage was the fee taken by a money issuer (a government) for the cost of minting the money. Money itself, at that time, was intrinsically valuable because it was made of metal.

Source: econterms

Selection problem

An important condition for empirical work (using either field or experimental data) is that the sample be drawn randomly from the population. If this is not the case, then the sample is said to be selected (i.e., selected according to some rule, not randomly). Statistical inference might not be valid when the sample is non-randomly selected.

Formally, the selection problem can be described as follows (see Manski, 1993). Each member of the population (say, an individual, a household, or a firm) is characterized by a triple (y,z,x). Suppose that the researcher is interested in a relationship between x, the independent variable, and y, the dependent variable. (The variables y and x can be discrete or real numbers, and they can be scalars or vectors.) The variable z is a binary indicator variable that takes only the values 0 and 1; for example, it might indicate whether an individual has answered a set of survey questions about y or not.

In general, the relationship in question can be described by a probability measure P(y|x); the most common example for such a relationship is the normal regression model. To learn about this relationship, the researcher draws a random sample from the population, observes all the realizations of (z,x), but observes realizations of y only when z = 1. The selection problem is the failure of the sampling process to identify the population probability measure P(y|x).

Without going into technical details any further, note that some conditional probabilities can still be identified from the data if one takes into account that the data are drawn conditional on z=1. Accordingly, statistical and econometric methods which deal with the selection problem combine the identified conditional probabilities with either additional a priori information or (untestable) identifying assumptions about some of the unidentified conditional probabilities involved. These methods allow the researcher to characterize the relationship of x and y in the population (under the identifying assumptions made).

One example of commonly made assumptions is that of ignorable non-response, i.e., the assumption that y and z are statistically independent, conditional on x. Recent research concentrates on those cases in which one is unwilling to make such strong assumptions and where there is no other prior information that can be exploited. While means are difficult to handle under these circumstances, some bounds can be derived for other population statistics such as quantiles and distribution functions.

Source: SFB 504

self-generating

Given an operator B() that operates on sets, a set W is self-generating if W is contained in B(W).

This definition is in Sargent (98) and may come from Abreu, Pearce, and Stacchetti (1990).

Source: econterms

semi-nonparametric

synonym for semiparametric.

Source: econterms

semi-strong form

Can refer to the semi-strong form of the efficient markets hypothesis, which is that any public information about a security is fully reflected in its current price.
Fama (1991) says that a more common and current name for tests of the semi-strong form hypothesis is 'event studies.'

Source: econterms

semilog

The semilog equation is an econometric model:

Y = e^(a + bX + e)

or equivalently

ln Y = a + bX + e

Commonly used to describe exponential growth curves. (Greene 1993, p 239)
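A minimal sketch, with invented parameter values, of recovering a and b by ordinary least squares on the logged equation:

```python
import math
import random

# Generate data from Y = exp(a + b*X + e), then estimate the semilog
# form ln Y = a + b*X + e by OLS. Parameter values are illustrative.
random.seed(1)
a_true, b_true = 0.5, 0.3
xs = [i / 10 for i in range(100)]
ys = [math.exp(a_true + b_true * xi + random.gauss(0, 0.05)) for xi in xs]

ly = [math.log(yi) for yi in ys]
n = len(xs)
mx = sum(xs) / n
my = sum(ly) / n
b_hat = sum((xi - mx) * (li - my) for xi, li in zip(xs, ly)) / \
        sum((xi - mx) ** 2 for xi in xs)
a_hat = my - b_hat * mx
# a_hat and b_hat recover a_true and b_true up to sampling error
```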

Source: econterms

semiparametric

An adjective that describes an econometric model with some components that are unknown functions, while others are specified as unknown finite dimensional parameters.

An example is the partially linear model.

Source: econterms

senior

Debts may vary in the order in which they must legally be paid in the event of bankruptcy of the individual or firm that owes the debt. The debts that must be paid first are said to be senior debts.

Source: econterms

SES

socioeconomic status

Source: econterms

shadow price

In the context of a maximization problem with a constraint, the shadow price on the constraint is the amount by which the objective function of the maximization would increase if the constraint were relaxed by one unit.

The value of a Lagrangian multiplier is a shadow price.

This is a striking and useful fact, but takes some practice to understand.

Source: econterms

shakeout

A period when the failure rate or exit rate of firms from an industry is unusually high.

Source: econterms

sharing rule

A function that defines the split of gains between a principal and agent. The gains are usually profits, and the split is usually a linear rule that gives a fraction to the agent. For example, suppose profits are x, which might be a random variable. The principal and agent might agree, in advance of knowing x, on a sharing rule s(x). Here s(x) is the amount given to the agent, leaving the principal with the residual gain x-s(x).

Source: econterms

Sharpe ratio

Computed in the context of the Sharpe-Lintner CAPM. Defined for an asset portfolio a that has mean return m_a, standard deviation s_a, and with risk-free rate r_f, by:

(m_a - r_f) / s_a

Higher Sharpe ratios are more desirable to the investor in this model.
The Sharpe ratio is a synonym for the "market price of risk." Empirically, for the NYSE, the Sharpe ratio is in the range of .30 to .40.
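A one-line computation of the ratio above; the return figures are illustrative, not from the entry:

```python
def sharpe_ratio(mean_return: float, risk_free_rate: float, std_dev: float) -> float:
    """(m_a - r_f) / s_a, all expressed per period (e.g. annualized)."""
    return (mean_return - risk_free_rate) / std_dev

# a portfolio earning 10% with 20% volatility, against a 3% risk-free rate
sr = sharpe_ratio(0.10, 0.03, 0.20)   # = 0.35, inside the .30-.40 range cited
```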

Source: econterms

SHAZAM

Econometric software published at the University of British Columbia. See http://shazam.econ.ubc.ca.

Source: econterms

Shephard's lemma

If c(p,u) is the minimum cost of attaining utility level u at prices p, then the Hicksian (compensated) demand for good i is the derivative of the cost function with respect to the i-th price: h_i(p,u) = dc(p,u)/dp_i.
Source: econterms

Sherman Act

1890 U.S. antitrust law. It has been described as vague, leading to ambiguous interpretations over the years.
Section one of the law forbids certain joint actions: "Every contract, combination in the form of trust or otherwise, or conspiracy, in restraint of trade or commerce among the several states, or with foreign nations, is hereby declared illegal...."
Section two of the law forbids certain individual actions: "Every person who shall monopolize, or attempt to monopolize, or combine or conspire with any other person or persons, to monopolize any part of the trade or commerce among the several states, or with foreign nations, shall be deemed guilty of a felony..."
The reasons for the passage of the Sherman Act:
(1) To promote competition to benefit consumers,
(2) Concern for injured competitors,
(3) Distrust of concentration of power.

Source: econterms

short rate

Abbreviation for 'short term interest rate'; that is, the interest rate charged (usually in some particular market) for short term loans.

Source: econterms

Shortfall risk measures

The starting point of shortfall risk measures is a target return, for example a one-month market return, defined by the investor. Risk is then understood as the possibility of falling short of this target return. Special cases of shortfall risk measures are the shortfall probability, the shortfall expectation, and the shortfall variance.

Source: SFB 504

Shubik model

A theoretical model designed to study the behavior of money. There are N goods traded in N(N-1) markets, one for each possible combination of good i and good j that could be exchanged. One assumes that only N of these markets are open; that good 0, acting as money, is traded for each of the other commodities but they are not exchanged for one another. Then one can study the behavior of the money good.

Source: econterms

SIC

Standard Industrial Classification code -- a four-digit number assigned to U.S. industries and their products. By "two-digit industries" we mean a coarser categorization, grouping the industries whose first two digits are the same.

Source: econterms

sieve estimators

Estimators that use flexible basis functions to approximate a function being estimated. Orthogonal series, splines, and neural networks may be examples. Donald (1997) and Gallant and Nychka (1987) may have more information.

Source: econterms

sigma-algebra

A collection of sets that satisfy certain properties with respect to their union. (Intuitively, the collection must include any result of complementations, unions, and intersections of its elements. The effect is to define properties of a collection of sets such that one can define probability on them in a consistent way.) Formally:
Let S be a set and A be a collection of subsets of S. A is a sigma-algebra of S if:
(i) the null set and S itself are members of A
(ii) the complement of any set in A is also in A
(iii) countable unions of sets in A are also in A.
It follows from these that a sigma-algebra is closed under complementation and under countable unions and intersections.
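For a finite set the countable conditions reduce to finite ones, so the three axioms can be checked mechanically. An illustrative sketch (not part of the definition):

```python
from itertools import chain, combinations

def powerset(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return [frozenset(c)
            for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def is_sigma_algebra(S, A):
    """Check axioms (i)-(iii) for a finite set S and collection A of subsets."""
    S = frozenset(S)
    A = set(A)
    if frozenset() not in A or S not in A:
        return False                                 # (i) null set and S
    if any(S - a not in A for a in A):
        return False                                 # (ii) complements
    if any(a | b not in A for a in A for b in A):
        return False                                 # (iii) (finite) unions
    return True

S = {1, 2, 3}
assert is_sigma_algebra(S, powerset(S))              # the power set qualifies
assert is_sigma_algebra(S, [frozenset(), frozenset(S)])   # the trivial sigma-algebra
# {null, {1}, S} fails: the complement of {1} is missing
assert not is_sigma_algebra(S, [frozenset(), frozenset({1}), frozenset(S)])
```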

Source: econterms

signaling game

A game in which a player with private information (the informed player) sends a signal of his private type to the uninformed player before the uninformed player makes a choice. An example: a candidate worker might suggest to the potential employer what wage is appropriate for himself in a negotiation.

Source: econterms

significance

A finding in economics may be said to be of economic significance (or substantive significance) if it shows a theory to be useful or not useful, or if it has implications for scientific interpretation or policy practice (McCloskey and Ziliak, 1996). Statistical significance is a property of the probability that a given finding was produced by a stated model at random: see significance level.

These meanings are different but sometimes overlap. McCloskey and Ziliak (1996) have a substantial discussion of them. Ambiguity is common in practice, but not hard to avoid. (Editorial comment follows.) When the second meaning is intended, use the phrase "statistically significant" and refer to a level of statistical significance or a p-value. Avoid the aggressive word "insignificant" unless it is clear whether the word is to be taken to mean substantively insignificant or not statistically significant.

Source: econterms

significance level

The significance level of a test is the probability that the test statistic will reject the null hypothesis when the null hypothesis is true. Significance is a property of the distribution of a test statistic, not of any particular draw of the statistic.

Source: econterms

Simpson paradox

The Simpson paradox refers to a tri-variate statistical problem in which a contingency between two variables, x and y, can be accounted for by a third variable, z. Solving the problem thus calls for a cognitive routine analogous to analysis of covariance. Cognitive research on the Simpson paradox addresses the question of whether the human mind can correctly handle these statistical tools.

An example of the Simpson paradox: one supermarket may be more expensive on aggregate (correlation between supermarket (x) and price (y)), but only because that supermarket sells more high-quality products (correlation between supermarket (x) and the percentage of high-quality products (z), and between price (y) and the percentage of high-quality products (z)). Considering high- and low-quality products separately, the seemingly more expensive supermarket may turn out to be cheaper at any quality level.
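A numerical sketch of the supermarket example, with invented prices and quantities:

```python
# (supermarket, quality) -> (average price, number of products sold).
# B undercuts A within each quality level, but sells mostly
# high-quality (dearer) goods.
prices = {
    ("A", "low"):  (2.0, 90),
    ("A", "high"): (6.0, 10),
    ("B", "low"):  (1.5, 10),
    ("B", "high"): (5.0, 90),
}

def mean_price(shop):
    """Overall average price at one supermarket, weighted by quantity."""
    total = sum(p * n for (s, q), (p, n) in prices.items() if s == shop)
    count = sum(n for (s, q), (p, n) in prices.items() if s == shop)
    return total / count

# Within each quality level B is cheaper ...
assert prices[("B", "low")][0] < prices[("A", "low")][0]
assert prices[("B", "high")][0] < prices[("A", "high")][0]
# ... yet B's overall average price is higher: the Simpson paradox.
assert mean_price("B") > mean_price("A")
```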

Source: SFB 504

simulated annealing

A method of finding optimal values numerically. Simulated annealing is a search method, as opposed to a gradient-based algorithm. It chooses a new point, and (for maximization) all uphill points are accepted, while some downhill points are accepted according to a probabilistic criterion.

Unlike the simplex search method provided by Matlab, simulated annealing may allow "bad" moves, thereby allowing escape from a local maximum. The value of a move is evaluated according to a temperature criterion (which essentially determines whether the algorithm is in a "hot" area of the function).
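A minimal sketch of the method for a one-dimensional maximization; the test function, cooling rate, and proposal width are illustrative choices only:

```python
import math
import random

# A two-peak test function: local maximum near x = -2 (height 0.6),
# global maximum near x = 3 (height 1.0).
def f(x):
    return math.exp(-(x - 3) ** 2) + 0.6 * math.exp(-(x + 2) ** 2)

random.seed(42)
x = -2.0                     # start at the *local* peak
x_best = x
temp = 1.0
for step in range(5000):
    candidate = x + random.gauss(0, 0.5)
    delta = f(candidate) - f(x)
    # uphill moves are always accepted; downhill moves are accepted with
    # probability exp(delta/temp), which shrinks as the temperature cools
    if delta > 0 or random.random() < math.exp(delta / temp):
        x = candidate
        if f(x) > f(x_best):
            x_best = x
    temp *= 0.999            # geometric cooling schedule
# because downhill moves are sometimes accepted early on, the chain can
# escape the local peak at x = -2, which a pure hill-climber could not
```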

Source: econterms

simultaneous equation system

By 'system' is meant that there are multiple, related, estimable equations. By simultaneous is meant that two quantities are jointly determined at time t by one another's values at time t-1 and possibly at t also.
Example, from Greene, (1993, p. 579), of market equilibrium:

qd = a1 p + a2 + ed   (demand equation)
qs = b1 p + es        (supply equation)
qd = qs = q
Here the quantity supplied is qs, quantity demanded is qd, price is p, the e's are errors or residuals, and the a's and b's are parameters to be estimated. We have data on p and q, and the quantities supplied and demanded are conjectural.
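Setting qd = qs and solving gives the reduced form of Greene's example; a small sketch with illustrative parameter values (downward-sloping demand, upward-sloping supply):

```python
# qd = a1*p + a2 + ed  (demand),  qs = b1*p + es  (supply),  qd = qs
def equilibrium(a1, a2, b1, ed=0.0, es=0.0):
    p = (a2 + ed - es) / (b1 - a1)   # from a1*p + a2 + ed = b1*p + es
    q = b1 * p + es
    return p, q

p, q = equilibrium(a1=-1.0, a2=10.0, b1=2.0)
# the same p and q satisfy both equations at once; this joint determination
# is why OLS applied to either equation alone gives biased estimates
```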

Source: econterms

single-crossing property

Distributions with cdfs F and G satisfy the single-crossing property if there is an x0 such that: F(x) >= G(x) for x<=x0 and G(x) >= F(x) for x>=x0

Source: econterms

sink

"In a second-order [linear difference equation] system, if both roots are positive and less than one, then the system converges monotonically to the steady state. If the roots are complex and lie inside the unit circle then the system spirals into the steady state. If at least one root is negative, but both roots are less than one in absolute value, then the system will flip from one side of the steady state to the other as it converges. In all of these cases the steady state is called a sink." Contrast 'source'.

Source: econterms

SIPP

The U.S. Survey of Income and Program Participation, which is conducted by the U.S. Census Bureau.

A tutorial is at: http://www.bls.census.gov/sipp/tutorial/SIPP_Tutorial_Beta_version/LAUNCHtutorial.html

Source: econterms

size

a synonym for significance level

Source: econterms

skewness

An attribute of a distribution. A distribution that is symmetric around its mean has skewness zero, and is 'not skewed'. Skewness is calculated as E[(x-mu)^3]/s^3 where mu is the mean and s is the standard deviation.
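A direct implementation sketch of the formula, using sample moments in place of the population quantities:

```python
def skewness(data):
    """Sample skewness: third central moment over the cubed standard deviation."""
    n = len(data)
    mu = sum(data) / n
    m2 = sum((x - mu) ** 2 for x in data) / n   # variance
    m3 = sum((x - mu) ** 3 for x in data) / n   # third central moment
    return m3 / m2 ** 1.5

# A symmetric sample has skewness zero; a long right tail makes it positive.
assert abs(skewness([1, 2, 3, 4, 5])) < 1e-12
assert skewness([1, 1, 1, 10]) > 0
```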

Source: econterms

skill

In regular English usage means "proficiency". Sometimes used in economics papers to represent experience and formal education. (Ed.: in this editor's opinion that is a dangerously misleading use of the term; it invites errors of thought and understanding.)

Source: econterms

SLID

Stands for Survey of Labour and Income Dynamics. A Canadian government database going back to 1993 at least. Web pages on this subject can be searched from: http://www.statcan.ca/english/search/index.htm.

Source: econterms

SLLN

Stands for strong law of large numbers.

Source: econterms

SMA

Structural Moving Average model, which see

Source: econterms

Smithian growth

Paraphrasing directly from Mokyr, 1990: Economic growth brought about by increases in trade.

Source: econterms

smoothers

Smoothers are estimators that produce smooth estimates of regression functions. They are nonparametric estimators. The most common and implementable types are kernel estimators, k-nearest-neighbor estimators, and cubic spline smoothers.

Source: econterms

smoothing

Smoothing of a data set {Xi, Yi} is the act of approximating m() in a regression such as:
Yi = m(Xi) + ei
The result of a smoothing is a smooth functional estimate of m().

Source: econterms

SMR

Standardized mortality ratio

Source: econterms

SMSA

Stands for Standard Metropolitan Statistical Area, a U.S. term for the standard boundaries of urban regions used in academic studies.

Source: econterms

SNP

abbreviation for 'seminonparametric', which means the same thing as semiparametric.

Source: econterms

social capital

The relationships of a person which lead to economically productive consequences. E.g., they may produce something analogous to investment returns to that person, or socially productive consequences to a larger society. "'Social capital' refers to the benefits of strong social bonds. [Sociologist James] Coleman defined the term to take in 'the norms, the social interworks, the relationships between adults and children that are of value for the children's growing up.' The support of a strong community helps the child accumulate social capital in myriad ways; in the [1990s U.S.] inner city, where institutions have disintegrated, and mothers often keep children locked inside out of fear for their safety, social capital hardly exists." -- Traub (2000)

Source: econterms

Social cognition

Social cognition is a label for processes that are instigated by social information. As a meta-theoretical perspective, social cognition has been applied to a variety of topic domains including person perception, stress and coping, survey responding, attitude-behavior relations, group processes, political decisions, and other domains. The basic processes presumed to operate across these domains are conceptualized in research on the perception of behavior: encoding, inferring, explaining, storing, retrieving, judging, and generating overt responses. Research in this domain takes this information flow beyond a single individual into social interaction.

Source: SFB 504

social planner

One solving a Pareto optimality problem. The problem faced by a social planner will have as an answer an allocation, without prices.
Also, "the social planner is subject to the same information limitations as the agents in the economy." -- Cooley and Hansen p 185 That is, the social planner does not see information that is hidden by the rules of the game from some of the agents. If an agent happens not to know something, but it is not hidden from him by the rules of the game, then the social planner DOES see it.

Source: econterms

Social psychology

Social psychology "attempts to understand and explain how thoughts, feelings or behaviour of individuals are influenced by the actual, imagined or implicit presence of others" (Allport, 1985).
Issues studied by social psychologists range from intrapersonal processes (e.g. person perception, attitude theory, social cognition) and interpersonal relations (e.g. aggression, interdependence) to intergroup behaviour.

Source: SFB 504

social savings

A measure of the contribution of a new technology, discussed in Crafts (2002), which asks 'How much more did [a new technology] contribute than an alternative investment might have yielded?' and cites Fogel (1979).

Source: econterms

social welfare function

A mapping from allocations of goods or rights among people to the real numbers.
Such a social welfare function (abbreviated SWF) might describe the preferences of an individual over social states, or might describe outcomes of a process that made allocations, whether or not individuals had preferences over those outcomes.

Source: econterms

SOFFEX

Swiss Options and Financial Futures Exchange

Source: econterms

Solas

Software for imputing values to missing data, published by Statistical Solutions.

Source: econterms

Solovian growth

Paraphrasing from Mokyr (1990): Economic growth brought about by investment, meaning increases in the capital stock.

Source: econterms

Solow growth model

Paraphrasing pretty directly from Romer, 1996, p 7:
The Solow model is meant to describe the production function of an entire economy, so all variables are aggregates. The date or time is denoted t. Output or production is denoted Y(t). Capital is K(t). Labor time is denoted L(t). Labor's effectiveness, or knowledge, is A(t). The production function is denoted F() and is assumed to have constant returns to scale. At each time t, the production function is:
Y = F(K, AL)
which can be written: Y(t) = F(K(t), A(t)L(t))

AL is effective labor.

Note variants of the way A enters into the production function. This one is called labor-augmenting or Harrod-neutral. Others are capital-augmenting, e.g. Y=F(AK,L), or Hicks-neutral, like Y=AF(K,L).

From _Mosaic of Economic Growth_, a definition of Solow-style growth models: they descend from the seminal Solow (1956). 'In Solow-style models, there exists a unique and globally stable growth path to which the level of labor productivity (and per capita output) will converge, and along which the rate of advance is fixed (exogenously) by the rate of technological progress.' Many subsequent models of aggregate growth (like Romer 1986) have abandoned the assumption that all forms of capital accumulation run into diminishing marginal returns, and get different global convergence implications. (p22)
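A minimal simulation sketch, assuming a Cobb-Douglas form F(K,AL) = K^a (AL)^(1-a) and, for simplicity, no population or technology growth; capital per worker then converges to a unique steady state, illustrating the globally stable path:

```python
# Cobb-Douglas production Y = K^alpha (AL)^(1-alpha); in per-worker terms
# (with A and L held constant) capital follows k' = s*k^alpha + (1-delta)*k.
# All parameter values are illustrative.
alpha = 0.3    # capital's share
s = 0.2        # saving rate
delta = 0.1    # depreciation rate

k = 1.0        # initial capital per worker
for t in range(1000):
    k = s * k ** alpha + (1 - delta) * k

# analytic steady state: s*k^alpha = delta*k  =>  k* = (s/delta)^(1/(1-alpha))
k_star = (s / delta) ** (1 / (1 - alpha))
# after many periods the simulated k sits (numerically) at k_star,
# regardless of the initial condition
```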

Source: econterms

Solow residual

A measure of the change in total factor productivity in a Solow growth model. This is a way of doing growth accounting empirically either for an industry or more commonly for a macroeconomy. Formally, roughly following Hornstein and Krusell (1996):

Suppose that in year t an economy produces output quantity yt with exactly two inputs: capital quantity kt and labor quantity lt. Assume perfectly competitive markets and that production has constant returns to scale. Let capital's share of income be fixed over time and denoted a. Then the change in total factor productivity between period t and period t+1, which is the Solow residual, is defined by:

Solow residual = (log TFPt+1) - (log TFPt)

= [(log yt+1) - (log yt)]
- a[(log kt+1) - (log kt)]
- (1-a)[(log lt+1) - (log lt)]
Analogous definitions exist for more complicated models (with other factors besides capital and labor) or on an industry-by-industry basis, or with capital's share varying by time or by industry.
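A numerical sketch of the definition, with invented growth figures (output up 4%, capital up 3%, labor up 1%, capital's share a = 1/3):

```python
import math

def solow_residual(y0, y1, k0, k1, l0, l1, a):
    """Log-difference TFP growth between two periods, capital share a."""
    dy = math.log(y1) - math.log(y0)
    dk = math.log(k1) - math.log(k0)
    dl = math.log(l1) - math.log(l0)
    return dy - a * dk - (1 - a) * dl

g_tfp = solow_residual(y0=100, y1=104, k0=300, k1=309, l0=50, l1=50.5, a=1/3)
# growth not accounted for by input growth:
# roughly 4% - (1/3)*3% - (2/3)*1%, i.e. about 2.3% per period
```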

The equation may look daunting but the derivations are not difficult and students are sometimes asked to practice them until they are routine. Hulten (2000) says about the residual that:
-- it measures shifts in the implicit aggregate production function.
-- it is a nonparametric index number which measures that shift in a computation that uses prices to measure marginal products.
-- the factors causing the measured shift include technical innovation, organizational and institutional changes, fluctuations in demand, changes in factor shares (where factors are capital, labor, and sometimes measures of energy use, materials use, and purchased services use), and measurement errors.

From an informal discussion by this editor, it looks like the residual contains these empirical factors, among others: public goods like highways; externalities from networks like the Internet; some externalities and losses of capital services from disasters like September 11; theft; shirking; and technical / technological change.

Source: econterms

solution concept

Phrase relevant to game theory. A game has a 'solution' which may represent a model's prediction. The modeler often must choose one of several substantively different solution methods, or solution concepts, which can lead to different game outcomes. Often one is chosen because it leads to a unique prediction. Possible solution concepts include:

iterative elimination of strictly dominated strategies
Nash equilibrium
Subgame perfect equilibrium
Perfect Bayesian equilibrium

Source: econterms

source

"In a second-order [linear difference equation] system, ... if both roots are positive and greater than one, then the system diverges monotonically to plus or minus infinity. If the roots are complex and [lie] outside the unit circle then the system spirals out away from the steady state. If at least one root is negative, but both roots are greater than one in absolute value, then the system will flip from one side of the steady state to the other as it diverges to infinity. In each of these cases the steady state is called a source." Contrast 'sink'.

Source: econterms

sparse

A matrix is sparse if many of its values are zero. A division of sample data into discrete bins (that is into a multinomial table) is sparse if many of the bins have no data in them.

Source: econterms

spatial autocorrelation

Usually autocorrelation means correlation among data from different time periods. Spatial autocorrelation means correlation among data from different locations; unlike autocorrelation over time, it can run along many dimensions. Nick J. Cox wrote, in a broadcast to a listserv discussing the software Stata, the following discussion of spatial autocorrelation. It is quoted here without any explicit permission whatsoever. (Parts clipped out are marked by 'snip'.) If 'Moran measure' and 'Geary measure' are standard terms used in economics I'll add them to the glossary.

Date: Thu, 15 Apr 1999 12:29:10 GMT
From: "Nick Cox" 
Subject: statalist: Spatial autocorrelation

[snip...]

First, the kind of spatial data considered here is data in two-dimensional
space, such as rainfall at a set of stations or disease incidence in a set of
areas, not three-dimensional or point pattern data (there is a tree or a
disease case at coordinates x, y).  Those of you who know time series might
expect from the name `spatial autocorrelation' estimation of a function,
autocorrelation as a function of distance and perhaps direction.  What is
given here are rather single-value measures that provide tests of
autocorrelation for problems where the possibility of local influences is of
most interest, for example, disease spreading by contagion. The set-up is that
the value for each location (point or area) is compared with values for its
`neighbours', defined in some way.

The names Moran and Geary are attached to these measures to honour the pioneer
work of two very fine statisticians around 1950, but the modern theory is due
to the statistical geographer Andrew Cliff and the statistician Keith Ord.

For a vector of deviations from the mean z, a vector of ones 1, and a matrix
describing the neighbourliness of each pair of locations W, the Moran measure
for example is

       (z' W z) / (z' z)
   I = -----------------
       (1' W 1) / (1' 1)

where ' indicates transpose. This measure is for raw data, not regression
residuals.

[snip; and the remainder discusses a particular implementation of a spatial
autocorrelation measuring function in Stata.]

For n values of a spatial variable x defined for various locations,
which might be points or areas, calculate the deviations
            _
    z = x - x

and for pairs of locations i and j, define a matrix

    W = ( w   )
           ij

describing which locations are neighbours in some precise sense.
For example, w   might be assigned 1 if i and j are contiguous areas
              ij
and 0 otherwise; or w   might be a function of the distance between
                     ij
i and j and/or the length of boundary shared by i and j.

The Moran measure of autocorrelation is

        n   n                      n   n         n   2
   n ( SUM SUM z  w   z  ) / ( 2 (SUM SUM w  )  SUM z  )
       i=1 j=1  i  ij  j          i=1 j=1  ij   i=1  i

and the Geary measure of autocorrelation is

             n   n               2           n   n         n   2
   (n -1) ( SUM SUM w   (z  - z )  ) / ( 4 (SUM SUM w  )  SUM z  )
            i=1 j=1  ij   i    j            i=1 j=1  ij   i=1  i

and these measures may be used to test the null hypothesis of no spatial
autocorrelation, using both a sampling distribution assuming that x
is normally distributed and a sampling distribution assuming randomisation,
that is, we treat the data as one of n! assignments of the n values to
the n locations.

In a toy example, area 1 neighbours 2, 3 and 4  and has value 3
                       2            1 and 4                   2
                       3            1 and 4                   2
                       4            1, 2 and 3                1

This would be matched by the data

^_n^ (obs no)    ^value^ (numeric variable)  ^nabors^ (string variable)
- -----------    ------------------------  ------------------------
    1                      3                    "2 3 4"
    2                      2                      "1 4"
    3                      2                      "1 4"
    4                      1                    "1 2 3"

That is, ^nabors^ contains the observation numbers of the neighbours of
the location in the current observation, separated by spaces. Therefore,
the data must be in precisely this sort order when ^spautoc^ is called.

Note various assumptions made here:

1. The neighbourhood information can be fitted into at most a ^str80^
variable.

2. If i neighbours j, then j also neighbours i and both facts are
specified.

By default this data structure implies that those locations listed
have weights in W that are 1, while all other pairs of locations are not
neighbours and have weights in W that are 0.

If the weights in W are not binary (1 or 0), use the ^weights^ option.
The variable specified must be another string variable.

^_n^ (obs no)  ^nabors^ (string variable)  ^weight^ (string variable)
- -----------  ------------------------  ------------------------
    1                "2 3 4"             ".1234 .5678 .9012"
    etc.

that is, w   = 0.1234, and so forth. w   need not equal w  .
          12                          ij                 ji

[snip]

References
- ----------

Cliff, A.D. and Ord, J.K. 1973. Spatial autocorrelation. London: Pion.

Cliff, A.D. and Ord, J.K. 1981. Spatial processes: models and
applications. London: Pion.

Author
- ------
         Nicholas J. Cox, University of Durham, U.K.
         n.j.cox@@durham.ac.uk

- ------------------------- end spautoc.hlp

Nick
n.j.cox@durham.ac.uk
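As a sketch (not the ^spautoc^ implementation quoted above), the matrix form of the Moran measure, I = (z'Wz / z'z) / (1'W1 / 1'1), can be applied to the toy four-area example:

```python
import numpy as np

# Toy example from the quoted help file: four areas with values and neighbours
x = np.array([3.0, 2.0, 2.0, 1.0])
neighbours = {0: [1, 2, 3], 1: [0, 3], 2: [0, 3], 3: [0, 1, 2]}

n = len(x)
W = np.zeros((n, n))
for i, js in neighbours.items():
    for j in js:
        W[i, j] = 1.0          # binary weights; both directions are specified

z = x - x.mean()               # deviations from the mean
ones = np.ones(n)

# Moran measure in the matrix form given above: (z'Wz / z'z) / (1'W1 / 1'1)
I = (z @ W @ z / (z @ z)) / (ones @ W @ ones / (ones @ ones))
```

For this toy example the measure is negative, reflecting that the high-valued area 1 neighbours the low-valued area 4.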

Source: econterms

SPE

Abbreviation for: Subgame perfect equilibrium

Source: econterms

specie

A commodity metal backing money; historically specie was gold or silver.

Source: econterms

spectral decomposition

The factorization of a positive definite matrix A into A=CLC', where L is a diagonal matrix of eigenvalues and the columns of C are the corresponding eigenvectors. That decomposition can be written as a sum of outer products:

A = (sum from i=1 to i=N of) Li ci ci'

where ci is the ith column of C.
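A minimal sketch using numpy: numpy.linalg.eigh returns the eigenvalues and orthonormal eigenvectors of a symmetric matrix, from which A can be rebuilt as the sum of outer products (the example matrix is hypothetical):

```python
import numpy as np

# A symmetric positive definite matrix (illustrative example)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns the eigenvalues and an orthonormal matrix C of eigenvectors
eigvals, C = np.linalg.eigh(A)

# Reconstruct A as the sum of outer products Li * ci * ci'
A_rebuilt = sum(eigvals[i] * np.outer(C[:, i], C[:, i])
                for i in range(len(eigvals)))
```

Positive definiteness shows up as strictly positive eigenvalues, and the reconstruction recovers A exactly up to floating-point error.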

Source: econterms

spectrum

Summarizes the periodicity properties of a time series or time series sample xt. Often represented in a graph with frequency (often denoted little omega) on the horizontal axis, and Sx(omega), which is defined below, on the vertical axis. Sx is zero at frequencies that are absent from the series or sample, and is larger at frequencies that are more important in the data.

Sx(omega) = (2 pi)^(-1) (sum for j from -infinity to +infinity of) gamma_j e^(-i*j*omega)

where gamma_j is the jth autocovariance, omega is in the range [-pi, pi], and i is the square root of -1.

Example 1: If xt is white noise, the spectrum is flat. All cycles are equally important. If they were not, the series would be forecastable.

Example 2: If xt is an AR(1) process, with coefficient in (0, 1), the spectrum has a peak at frequency zero and declines monotonically with distance from zero. This process does not have an observable cycle.
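A sketch for Example 2, assuming an AR(1) with coefficient phi = 0.5 and unit innovation variance, so that gamma_j = phi^|j| / (1 - phi^2). Truncating the infinite sum at a finite number of lags reproduces the closed-form spectrum, with its peak at frequency zero:

```python
import math

phi = 0.5  # AR(1) coefficient in (0, 1); innovation variance assumed 1

def gamma(j):
    # Autocovariances of an AR(1): gamma_j = phi^|j| / (1 - phi^2)
    return phi ** abs(j) / (1 - phi ** 2)

def spectrum(omega, J=200):
    # Truncated version of Sx(omega) = (2 pi)^-1 sum_j gamma_j e^(-i j omega)
    s = sum(gamma(j) * complex(math.cos(-j * omega), math.sin(-j * omega))
            for j in range(-J, J + 1))
    return s.real / (2 * math.pi)

def closed_form(omega):
    # Known closed form for the AR(1) spectrum with unit innovation variance
    return 1 / (2 * math.pi * (1 - 2 * phi * math.cos(omega) + phi ** 2))
```

The truncated sum matches the closed form closely because the autocovariances decay geometrically, and the spectrum declines monotonically from its peak at omega = 0.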

Source: econterms

speculative demand

The speculative demand for money is inversely related to the interest rate.

Source: econterms

spline function

The kind of estimate produced by a spline regression, in which the slope varies for different ranges of the regressors. The spline function is continuous but usually not differentiable.

Source: econterms

spline regression

A regression which estimates different linear slopes for different ranges of the independent variables. The endpoints of the ranges are called knots.
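A minimal sketch with one knot, using hypothetical data: adding the "hinge" regressor max(x - knot, 0) lets the estimated slope change at the knot while keeping the fitted function continuous:

```python
import numpy as np

# Hypothetical data from a true piecewise-linear function with a knot at
# x = 5: slope 1 below the knot, slope 3 above it, plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = x + 2 * np.maximum(x - 5, 0) + rng.normal(0, 0.1, x.size)

# Regressors: intercept, x, and the hinge term max(x - knot, 0)
knot = 5.0
X = np.column_stack([np.ones_like(x), x, np.maximum(x - knot, 0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

slope_below = beta[1]             # estimated slope for x < knot
slope_above = beta[1] + beta[2]   # estimated slope for x > knot
```

The coefficient on the hinge term estimates the change in slope at the knot, so the two regional slopes are recovered from the single least-squares fit.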

Source: econterms

spline smoothing

A particular nonparametric estimator of a function. Given a data set {Xi, Yi} it estimates values of Y for X's other than those in the sample. The process is to construct a function that balances the twin needs of (1) proximity to the actual sample points, (2) smoothness. So a 'roughness penalty' is defined. See Hardle's equation 3.4.1 near p. 56 for the 'cubic spline' which seems to be the most common.

Source: econterms

SPO

stands for Strongly Pareto Optimal, which see.

Source: econterms

SPSS

Stands for 'Statistical Product and Service Solutions', a corporation at www.spss.com

Source: econterms

SSEP

Social Science Electronic Publishing, Inc.

Source: econterms

SSRN

Social Science Research Network. Their web site

Source: econterms

stabilization policy

'Macroeconomic stabilization policy consists of all the actions taken by governments to (1) keep inflation low and stable; and (2) keep the short-run (business cycle) fluctuations in output and employment small.' Includes monetary and fiscal policies, international and exchange rate policy, and international coordination. (p129 in Taylor (1996)).

Source: econterms

stable distributions

See Campbell, Lo, and MacKinlay pp 17-18. Ref to the French probability theorist Levy. The normal, Cauchy, and Bernoulli distributions are special cases. Except for the normal distribution, they have infinite variance.
There has been some study of whether continuously compounded asset returns could fit a stable distribution, given that their kurtosis is too high for a normal distribution.

Source: econterms

stable steady state

In a dynamical system with deterministic generator function F() such that Nt+1=F(Nt), a steady state is stable if, loosely, all nearby trajectories converge to it.
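A minimal sketch with an assumed linear generator function: F(N) = 0.5*N + 1 has the steady state N* = 2 (it solves N = 0.5*N + 1), and since |F'(N*)| = 0.5 < 1, trajectories starting nearby converge to it:

```python
# Iterate the map N_{t+1} = F(N_t) from several initial conditions.
def F(N):
    return 0.5 * N + 1.0   # steady state N* = 2, stable since |F'| = 0.5 < 1

finals = []
for start in (0.0, 5.0, -3.0):   # several different initial conditions
    N = start
    for _ in range(100):          # iterate the map many times
        N = F(N)
    finals.append(N)
```

After many iterations each trajectory is essentially at the steady state, illustrating stability.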

Source: econterms

staggered contracting

A model can be constructed in which some agents, usually firms, cannot change their prices at will. They make a contract at some price for a specified duration, and only when that time is up can they change the price. If the terms of the contracts overlap, that is, they do not all end at the same time, we say the contracts are staggered.

An important paper on this topic was Taylor (1980) which showed that staggered contracts can have an effect of persistence -- that is, that one-time shocks can have effects that are still evolving for several periods. This is a version of a new Keynesian, sticky-price model.

Source: econterms

standard normal

Refers to a normal distribution with mean of zero and variance of one.

Source: econterms

Standard operating procedures

Standard operating procedures (SOPs) are part of the formal structure of organizations. They serve to coordinate divided labor processes. At the same time, they are part of the decision environment that organizations equip their members with: SOPs limit the aspects of reality that are relevant for certain decisions, and thus reduce the complexity of decision problems. SOPs can also be the result of trial-and-error processes, and so might be regarded as an accumulation of organizational experience. On this view, SOPs raise the quality of problem handling and the rationality of decision makers in organizations. They enable individuals with limited rationality to engage in more effective information gathering and processing. However, decisions in organizations are not just executions of SOPs. Rather, SOPs need to be complemented by the interpretations of decision makers. Indeed, strict observance of SOPs can even result in dysfunctional or irrational decisions.

Source: SFB 504

Stata

Statistical analysis software. Stata web site

Source: econterms

state price

the price at time zero of a state-contingent claim that pays one unit of consumption in a certain future state.

Source: econterms

state price vector

the vector of state prices for all states.

Source: econterms

state-space approach to linearization

Approximating decision rules by linearizing the Euler equations of the maximization problem around the stationary steady state and finding a unique solution to the resulting system of dynamic equations

Source: econterms

States

are temporary conditions within an individual such as anger, stress or fear; opposed to traits that are more permanent.

Source: SFB 504

statistic

a function of one or more random variables that does not depend upon any unknown parameter.
(The distribution of the statistic may depend on one or more unknown parameters, but the statistic can be calculated without knowing them just from the realizations of the random variables, e.g. the data in a sample.)
In general a statistic could be a vector of values, but often it is a scalar.

Source: econterms

Statistica

Statistical software. See http://www.statsoft.com.

Source: econterms

statistical discrimination

A theory of why minority groups are paid less when hired. The theory is roughly that managers, who are of one type (say, white), are more culturally attuned to the applicants of their own type than to applicants of another type (say, black), and therefore they have a better measure of the likely productivity of the applicants of their own type. (There is uncertainty in the manager's predictions about blacks and probably of whites too, but more uncertainty for blacks.) Because the managers are risk averse they bid more for a white applicant of a given apparent productivity than for a black one, since their measure of the white's productivity is better. This theory predicts that white managers would offer black applicants lower starting wages than whites of the same apparent ability, even if the manager is not prejudiced against the blacks.

Source: econterms

statistics

Source: econterms

stochastic

synonym for random.

Source: econterms

stochastic difference equation

A linear difference equation with random forcing variables on the right-hand side. Here is a stochastic difference equation in k:
kt+1 + kt = wt
where the k's and w's are scalars, time t goes from 0 to infinity, and the w's are exogenous forcing variables. Or:
Akt+1 + Bkt + Ckt-1 = Dwt + et
where the k's are vectors, the w's and e's are exogenous vectors, and A, B, C, and D are constant matrices.
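A minimal simulation of the scalar example, assuming iid standard normal forcing variables:

```python
import random

random.seed(0)

# The scalar example above, k_{t+1} + k_t = w_t, rewritten as
# k_{t+1} = w_t - k_t, simulated forward from k_0 = 0.
k = [0.0]                          # initial condition k_0 = 0
for t in range(50):
    w_t = random.gauss(0.0, 1.0)   # exogenous random forcing variable
    k.append(w_t - k[t])
```

Each new value of k is determined by the previous value plus the realized random forcing term, which is what makes the difference equation stochastic.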

Source: econterms

stochastic dominance

An abbreviation for first-order stochastic dominance, a possible comparison relationship between two stochastic distributions. Let the possible returns from assets A and B be described by statistical distributions A and B. Payoff distribution A first-order stochastically dominates payoff distribution B if, for every possible payoff level, the probability of getting at least that payoff is at least as high under A as under B.

Much more is in Huang and Litzenberger (1988), chapter 2.
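A sketch with hypothetical discrete payoff distributions: A first-order stochastically dominates B exactly when A's cumulative distribution function never lies above B's:

```python
# Two discrete payoff distributions on a common grid (illustrative numbers)
payoffs = [1, 2, 3]
prob_A = [0.1, 0.3, 0.6]   # A puts more weight on high payoffs
prob_B = [0.3, 0.4, 0.3]

def cdf(probs):
    # Running cumulative sums give the CDF on the payoff grid
    out, total = [], 0.0
    for p in probs:
        total += p
        out.append(total)
    return out

# A dominates B iff F_A(x) <= F_B(x) at every payoff level x
A_dominates_B = all(fa <= fb + 1e-12
                    for fa, fb in zip(cdf(prob_A), cdf(prob_B)))
```

Because A's CDF lies weakly below B's everywhere, any agent who prefers more to less would prefer A.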

Source: econterms

stochastic process

is an ordered collection of random variables. Discrete ones are indexed, often by the subscript t for time, e.g., yt, yt+1, although such a process could be spatial instead of temporal. Continuous ones can be described as continuous functions of time, e.g. y(t).

A stochastic process is specified by properties of the joint distribution for those random variables. Examples:

-- the random variables are independently and identically distributed (iid).
-- the process is a Markov process
-- the process is a martingale
-- the process is white noise
-- the process is autoregressive (e.g. AR(1))
-- the process has a moving average (e.g. see MA(1))

Source: econterms

Stolper-Samuelson theorem

In some models of international trade, trade lowers the real wage of the scarce factor of production, and protection from trade raises it. That is a Stolper-Samuelson effect, by analogy to their (1941) theorem in a Heckscher-Ohlin model context.

A notable case is when trade between a modernized economy and a developing one would lower the wages of the unskilled in the modernized economy because the developing country has so many of the unskilled.

Source: econterms

stopping rule

A stopping rule, in the context of search theory, is a mapping from histories of draws to one of two decisions: stop at this draw, or continue drawing.

Source: econterms

storable

A good is storable to the degree that it does not degrade or lose its value over time. In models of money, storable goods dominate less storable goods as media of exchange.

Source: econterms

straddle

An options trading strategy of buying a call option and a put option on the same stock with the same strike price and expiration date. Such a strategy would result in a profitable position if the stock price is far enough from the strike price.
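A sketch of the profit at expiration, with hypothetical strike and premium figures:

```python
# Straddle: long one call and one put on the same stock with the same
# strike and expiration; premium figures here are hypothetical.
def straddle_profit(stock_price, strike, call_premium, put_premium):
    call_payoff = max(stock_price - strike, 0.0)   # call pays when price > strike
    put_payoff = max(strike - stock_price, 0.0)    # put pays when price < strike
    return call_payoff + put_payoff - call_premium - put_premium

K, call_prem, put_prem = 100.0, 4.0, 3.0
```

At the strike the position loses both premiums, while far enough above or below the strike the payoff from one leg outweighs the total premium paid.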

Source: econterms

Strategic Equilibrium

Profile of plans leading to an outcome of a game that is stable in the sense that given the other players adhere to the equilibrium prescription, no single player wants to deviate from the prescription. Any outcome that is reached by the play of strategies which do not form an equilibrium is an implausible way of playing the game, because at least one player could improve by selecting another strategy.

The concept of strategic equilibrium is completely unrelated to (Pareto) efficiency. Correspondingly, infinitely many games have (only) inefficient strategic equilibria; for a striking example, see the Prisoners' Dilemma game. As a strategic equilibrium is a profile of strategies that is unilaterally unimprovable given that all (other) players conform to their equilibrium strategies, the concept is weak and very general, but on the other hand most games possess several strategic equilibria. One of the major achievements of game theory accordingly has been the refinement of the concept of strategic equilibrium to allow for sharper predictions.

Two major achievements in refining the concept of equilibrium center around the 'time consistency' of strategically stable plans for sequential games, and on making precise the role of the players' beliefs about other players' plans of actions and information. A more general definition of strategic equilibrium is the following: an equilibrium is a profile of strategies and a profile of beliefs such that given the beliefs, the strategies are unilaterally unimprovable given equilibrium behavior, and such that the beliefs are consistent with the actual courses of action prescribed by the equilibrium strategies.

Source: SFB 504

Strategy in game theory and economics

In non-cooperative game theory, strategies are the primitives the player can choose between. A player's strategy is the action or the plan of actions this player chooses out of his set of strategies. For example in an auction, the strategy of a player describes the way this player bids.

In the simplest games (with complete information and simultaneous moves), the strategy of a player simply specifies which action that player takes. To be general enough to cover more complex games as well (like dynamic games of incomplete information), the notion of strategy is very comprehensive: a strategy of a player is a complete plan of action in the game. In particular, for each point in time where the player is called upon to act, it describes which action to choose. And it does so for every combination of previous moves (of this player and of his opponents) and for each type of player.

For example, in a dynamic game like chess, a strategy specifies not only the move of a player in the first round, but also in every consecutive round for every possible combination of previous rounds (e.g. if a strategy in a suitable game specifies that a player commits suicide today, it must also specify what he would do tomorrow if he still were alive.)

In games of incomplete information there are different types of players, e.g. in auctions there are types with different valuations for the object for sale. Here a strategy specifies a complete plan of action for every such type.

Source: SFB 504

strategy-proof

A decision rule (a mapping from expressed preferences by each of a group of agents to a common decision) "is strategy-proof if in its associated revelation game, it is a dominant strategy for each agent to reveal its true preferences."

Source: econterms

strict stationarity

Describes a stochastic process whose joint distribution of observations is not a function of time. Contrast weak stationarity.

Source: econterms

strict version of Jensen's inequality

Quoting directly from Newey-McFadden: "[I]f a(y) is a strictly concave function [e.g. a(y)=ln(y)] and Y is a nonconstant random variable, then a(E[Y]) > E[a(Y)]."

Source: econterms

Strictly dominated strategy

A strategy is strictly dominated if there is a second strategy that yields a strictly higher payoff than the first one for every possible combination of strategies of the opponents. Rational game theory expects that strictly dominated strategies are never played. If for every player one strategy strictly dominates all other strategies of this player, game theorists expect the combination of these strictly dominant strategies to be the outcome of the game. Unfortunately, typically there are no strictly dominant strategies, so weaker equilibrium concepts have to be used to predict play in such games.

Source: SFB 504

strictly stationary

A random process {xt} is strictly stationary if the joint distribution of elements is not a function of the index t. This is a stronger condition than weak stationarity (which see; it's easier to understand) for any random process with first and second moments, because it requires also that the third moments, etc, be stationary.

Source: econterms

strip financing

Corporate financing by selling "stapled" packages of securities that cannot be sold separately. E.g., a firm might sell bonds only in a package that includes a standard proportion of senior subordinated debt, convertible debt, preferred stock, and common stock. A benefit is reduced conflict: in principle bondholders and stockholders have different interests, and that can impose costs on the firm. After a strip financing, however, those groups are made up of the same people, so their interests coincide.

Source: econterms

strips

securities made up of standardized proportions of other securities from the same firm. See strip financing.

U.S. Treasury bonds can be split into principal and interest components, and the standard name for the resulting securities is STRIPS (Separate Trading of Registered Interest and Principal of Securities). See coupon strip and principal strip.

Source: econterms

strong form

Can refer to the strong form of the efficient markets hypothesis, which is that any public or private information known to anyone about a security is fully reflected in its current price.
Fama (1991) renames tests of the strong form of the hypothesis to be 'tests for private information.' Roughly -- If individuals with private information can make trading gains with it, the strong form hypothesis does not hold.

Source: econterms

strong incentive

An incentive that encourages maximization of an objective. For example, payment per unit of output produced encourages maximum production. Useful in design of a contract if the buyer knows exactly what is desired. Contrast weak incentive.

Source: econterms

strong law of large numbers

If {Zt} is a sequence of n iid random variables drawn from a distribution with mean MU, then the sample average of the Z's converges to MU with probability one as the sample size n goes to infinity.

I believe that strong laws of large numbers are generally, or perhaps always, proved using some version of Chebyshev's inequality. (The proof is rarely shown; in most contexts in economics one can simply assume laws of large numbers).
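A minimal numerical illustration (not a proof), with a hypothetical mean and a fixed random seed:

```python
import random

random.seed(1)

# Sample averages of iid normal draws get close to the mean MU as n grows.
MU = 2.0

def sample_average(n):
    return sum(random.gauss(MU, 1.0) for _ in range(n)) / n

small_error = abs(sample_average(100) - MU)
large_error = abs(sample_average(10_000) - MU)
```

With 10,000 draws the sample average is typically much closer to MU than with 100 draws, as the law suggests.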

Source: econterms

strongly consistent

An estimator for a parameter is strongly consistent if the estimator goes to the true value almost surely as the sample size n goes to infinity. This is a stronger condition than weak consistency; that is, all strongly consistent estimators are weakly consistent but the reverse is not true.

Source: econterms

strongly dependent

A time series process {xt} is strongly dependent if it is not weakly dependent; that is, if it is strongly autocorrelated, either positively or negatively.

Example 1: A random walk with correlation 1 between observations is strongly dependent.

Example 2: An iid process is not strongly dependent.

Source: econterms

strongly ergodic

A stochastic process may be strongly ergodic even if it is nonstationary. A strongly ergodic process is also weakly ergodic.

Source: econterms

Strongly Pareto Optimal

A strongly Pareto optimal allocation is one such that no other allocation would be both (a) at least as good for everyone and (b) strictly preferred by some.

Source: econterms

structural break

A structural change detected in a time series sample.

Source: econterms

structural change

A change in the parameters of a structure generating a time series. There exist tests for whether the parameters changed. One is the Chow test.

Examples: (planned)

Source: econterms

structural moving average model

The model is a multivariate, discrete-time, dynamic econometric model. Let yt be an ny x 1 vector of observable economic variables, let C(L) be an ny x ne matrix of lag polynomials, and let et be a vector of exogenous unobservable shocks, e.g. to labor supply, the quantity of money, and labor productivity. Then:
yt=C(L)et
is a structural moving average model.

Source: econterms

structural parameters

Underlying parameters in a model or class of models.

If a theoretical model explains two effects of variable x on variable y, one of which is positive and one negative, they are structurally separate. In another model, in which only the net effect of x on y is relevant, one structural parameter for the effect may be sufficient.

So a parameter is structural if a theoretical model has a distinct structure for its effect. The definition is not absolute, but relative to a model or class of models which are sometimes left implicit.

Source: econterms

structural unemployment

Unemployment that arises because there is no demand for the workers that are available. Contrast frictional unemployment.

Source: econterms

structure

A model with its parameters fixed. One can discuss properties of a model with various parameters, but 'structural' properties are those that are fixed unless parameters change.

Source: econterms

Student t

Synonym for the t distribution. The name came about because the original researcher who described the t distribution wrote under the pseudonym 'Student'.

Source: econterms

stylized facts

Observations that have been made in so many contexts that they are widely understood to be empirical truths, to which theories must fit. Used especially in macroeconomic theory. Considered unhelpful in economic history, where context is central.

Source: econterms

subdifferential

A class of slopes. By example -- consider the top half of a stop sign as a function graphed on the xy-plane. It has well-defined derivatives except at the corners. The subdifferential is made up of only one slope, the derivative, at those points. At the corners there are many 'tangents' which define lines that are everywhere above the stop sign except at the corner. The slopes of those lines are members of the subdifferential at those points.

In general equilibrium usage, the subdifferential can be a class of prices. It's the set of prices such that expanding the total endowment constraint would not cause buying and selling, because the agents have optimized perfectly with respect to the prices. So if a set of prices is possible for a Walrasian equilibrium, it is in the subdifferential of that allocation.

Source: econterms

subgame perfect equilibrium

An equilibrium in which the strategies are a Nash equilibrium, and, within each subgame, the parts of the strategies relevant to the subgame make a Nash equilibrium of the subgame.

Source: econterms

Subgame perfect equilibrium

In extensive-form games with complete information, many strategy profiles that form best responses to one another imply incredible threats or promises that a player no longer wants to carry out once he must face an (unexpected) off-equilibrium move of an opponent. If the profile of strategies is such that no player wants to amend his strategy at whatever decision node can be reached during the play of the game, the equilibrium profile of strategies is called subgame perfect. In this sense, a subgame-perfect strategy profile is 'time consistent' in that it remains an equilibrium in whatever truncation of the original game (subgame) the players may find themselves.

Source: SFB 504

submartingale

A kind of stochastic process; one in which the expected value of next period's value, as projected on the basis of the current period's information, is greater than or equal to the current period's value.
This kind of process could be assumed for securities prices.

Source: econterms

subordinated

Adjective. A particular debt issue is said to be subordinated if it was senior but because of a subsequent issue of debt by the same firm is no longer senior. One says, 'subordinated debt'.

Source: econterms

substitution bias

A possible problem with a price index. Consumers can substitute goods in response to price changes. For example when the price of apples rises but the price of oranges does not, consumers are likely to switch their consumption a little bit away from apples and toward oranges, and thereby avoid experiencing the entire price increase. A substitution bias exists if a price index does not take this change in purchasing choices into account, e.g. if the collection ('basket') of goods whose prices are compared over time is fixed.

'For example, when used to measure consumption prices between 1987 and 1992, a fixed basket of commodities consumed in 1987 gives too much weight to the prices that rise rapidly over the timespan and too little weight to the prices that have fallen; as a result, using the 1987 fixed basket overstates the 1987-92 cost-of-living change. Conversely, because consumers substitute, a fixed basket of commodities consumed in 1992 gives too much weight to the prices that have fallen over the timespan and too little to the prices that have risen; as a result, the 1992 fixed basket understates the 1987-92 cost-of-living change.' (Triplett, 1992)
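A sketch with hypothetical prices and quantities: the fixed-base (Laspeyres) index overstates the cost-of-living change relative to the current-basket (Paasche) index when consumers substitute away from the good whose price rose:

```python
# Two goods; apples double in price and consumers shift toward oranges.
p0 = {"apples": 1.0, "oranges": 1.0}   # base-period prices
p1 = {"apples": 2.0, "oranges": 1.0}   # current-period prices
q0 = {"apples": 10, "oranges": 10}     # base-period basket
q1 = {"apples": 4, "oranges": 16}      # current-period basket (after substitution)

def price_index(prices1, prices0, basket):
    # Ratio of the basket's cost at the two sets of prices
    cost1 = sum(prices1[g] * basket[g] for g in basket)
    cost0 = sum(prices0[g] * basket[g] for g in basket)
    return cost1 / cost0

laspeyres = price_index(p1, p0, q0)   # fixed base-period basket
paasche = price_index(p1, p0, q1)     # current-period basket
```

The Laspeyres index ignores the substitution toward oranges and so reports a larger price increase than the Paasche index, illustrating the bias described above.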

Source: econterms

SUDAAN

A statistical software program designed especially to analyze clustered data and data from sample surveys. The SUDAAN Web site is at http://www.rti.org/patents/sudaan/sudaan.html.

Source: econterms

sufficient statistic

Suppose one has samples from a distribution, does not know exactly what that distribution is, but does know that it comes from a certain set of distributions that is determined partly or wholly by a certain parameter, q. A statistic is sufficient for inference about q if and only if the values of any sample from that distribution give no more information about q than does the value of the statistic on that sample.

E.g. if we know that a distribution is normal with variance 1 but has an unknown mean, the sample average is a sufficient statistic for the mean.
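A small sketch of that example (hypothetical data): for i.i.d. N(mu, 1) samples, the log-likelihood difference between two datasets with the same sample mean (and size) is constant in mu, so once the mean is known the data themselves add no further information about mu.

```python
import math

def loglik(sample, mu):
    # Log-likelihood of i.i.d. N(mu, 1) data, up to an additive constant.
    return -0.5 * sum((x - mu) ** 2 for x in sample)

a = [1.0, 2.0, 3.0]   # sample mean 2.0
b = [0.0, 2.0, 4.0]   # different data, same sample mean 2.0

# The log-likelihood difference is the same at every mu, so the two
# samples rank every candidate mu identically.
diffs = [loglik(a, mu) - loglik(b, mu) for mu in (-1.0, 0.0, 2.0, 5.0)]
print(diffs)  # [3.0, 3.0, 3.0, 3.0]
```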

Source: econterms

sunk costs

Unrecoverable past expenditures. These should not normally be taken into account when determining whether to continue a project or abandon it, because they cannot be recovered either way. It is a common instinct to count them, however.

Source: econterms

sup

Stands for 'supremum', the least upper bound: the smallest value that is at least as large as every element of a given set. A supremum can exist in contexts where a maximum does not, because (say) the set is open; e.g. the set (0,1) has no maximum, but its supremum is 1.

sup is a mathematical operator that maps from a set to a value that is syntactically like an element of that set, although it may not actually be a member of the set.
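A small illustration (a finite truncation of the open-set example above): the elements of (0,1) of the form 1 - 1/n get arbitrarily close to 1 without ever reaching it, so no element is a maximum, yet 1 is the supremum.

```python
# Finitely many elements of the set {1 - 1/n : n = 1, 2, ...},
# whose supremum is 1 even though 1 is not a member.
elements = [1 - 1/n for n in range(1, 10001)]

print(max(elements))             # 0.9999 -- largest of these elements, still < 1
print(all(x < 1 for x in elements))  # True: 1 bounds the set without belonging to it
```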

Source: econterms

superlative index numbers

'What Diewert called 'superlative' index numbers were those that provide a good approximation to a theoretical cost-of-living index for large classes of consumer demand and utility function specifications. In addition to the Tornqvist index, Diewert classified Irving Fisher's 'Ideal' index as belonging to this class.' -- Gordon, 1990, p. 5

From Harper (1999, p. 335): 'The term "superlative index number" was coined by W. Erwin Diewert (1976) to describe index number formulas which generate aggregates consistent with flexible specifications of the production function.'

Two examples of superlative index number formulas are the Fisher Ideal index and the Tornqvist index. These indexes 'accommodate substitution in consumer spending while holding living standards constant, something the Paasche and Laspeyres indexes do not do.' (Triplett, 1992, p. 50).
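A minimal sketch of both formulas, using hypothetical two-good, two-period data (not from the sources cited above). The Fisher Ideal index is the geometric mean of the Laspeyres and Paasche indexes; the Tornqvist index weights each good's price relative by the average of its expenditure shares in the two periods.

```python
import math

# Hypothetical prices and quantities in periods 0 and 1.
p0, p1 = [1.0, 1.0], [2.0, 1.0]
q0, q1 = [10.0, 10.0], [6.0, 14.0]

laspeyres = sum(a * b for a, b in zip(p1, q0)) / sum(a * b for a, b in zip(p0, q0))
paasche   = sum(a * b for a, b in zip(p1, q1)) / sum(a * b for a, b in zip(p0, q1))

# Fisher Ideal index: geometric mean of Laspeyres and Paasche.
fisher = math.sqrt(laspeyres * paasche)

# Tornqvist index: geometric mean of price relatives, weighted by
# average expenditure shares across the two periods.
s0 = [p0[i] * q0[i] / sum(a * b for a, b in zip(p0, q0)) for i in range(2)]
s1 = [p1[i] * q1[i] / sum(a * b for a, b in zip(p1, q1)) for i in range(2)]
tornqvist = math.exp(sum(0.5 * (s0[i] + s1[i]) * math.log(p1[i] / p0[i])
                         for i in range(2)))

# Both superlative indexes lie between Paasche and Laspeyres.
print(fisher, tornqvist)
```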

Source: econterms

superneutrality

Money in a model 'is said to be superneutral if changes in [nominal] money growth have no effect on the real equilibrium.' Contrast neutrality.

Source: econterms

supply curve

For a given good, the supply curve is a relation between each possible price of the good and the quantity that would be supplied for market sale at that price.

Drawn in introductory classes with this arrangement of the axes, although price is thought of as the independent variable:

Price   |         / Supply
        |       /
        |     /
        |   /
        |________________________
                        Quantity

Source: econterms

support

of a distribution. Informally, the domain of the probability function; it includes the set of outcomes that have positive probability. A little more exactly: a set of values that a random variable may take, such that the probability is one that the variable takes one of those values. Note that a support is not unique, because it could also include outcomes with zero probability.

Source: econterms

SUR

Stands for Seemingly Unrelated Regressions. The situation is one where the errors across equations are thought to be correlated, and one would like to use this information to improve estimates. One makes an SUR estimate by estimating each equation by OLS, using the residuals to estimate the cross-equation error covariance matrix, and then running GLS.

The term comes from Arnold Zellner and may have been used first in Zellner (1962).

Source: econterms

SURE

same as SUR estimation.

Source: econterms

Survey of Consumer Finances

There is a U.S. survey and a Canadian survey by this name.

The U.S. one is a survey of U.S. households by the Federal Reserve which collects information on their assets and debt. The survey oversamples high income households because that's where the wealth is. The survey has been conducted every three years since 1983.

The Canadian one is an annual supplement to the Labour Force Survey that is carried out every April.

Source: econterms

survival function

From a model of durations between events (which are indexed here by i): the probability that event i has not yet happened, as a function of the time elapsed since event (i-1).
E.g., denoting that probability by S_i():
S_i(t | t_{i-1}, t_{i-2}, ...)
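A common parametric special case (an assumption here, not part of the entry): if durations are exponentially distributed with hazard rate lam, the survival function is S(t) = exp(-lam * t), the probability that the next event has not occurred within t time units of the previous one.

```python
import math

def survival(t, lam=0.5):
    # Exponential-duration survival function S(t) = exp(-lam * t),
    # with hazard rate lam (a hypothetical value here).
    return math.exp(-lam * t)

print(survival(0.0))  # 1.0: at the moment of the last event, no new event yet
print(survival(2.0))  # exp(-1), about 0.368
```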

Source: econterms

SVAR

Structural VAR (Vector Autoregression).
The SVAR representation of an SMA model comes from inverting the matrix of lag polynomials C(L) (see the SMA definition) to get: A(L)y_t = e_t
The SVAR is useful for (1) estimating A(L), and (2) reconstructing the shocks e_t if A(L) is known.
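Point (2) can be sketched with a first-order bivariate system (hypothetical coefficient matrices; numpy assumed available): with A(L) = A0 - A1 L known, the structural shocks are recovered as e_t = A0 y_t - A1 y_{t-1}.

```python
import numpy as np

# Hypothetical structural VAR(1):  A0 y_t = A1 y_{t-1} + e_t.
A0 = np.array([[1.0, 0.0], [0.5, 1.0]])
A1 = np.array([[0.4, 0.1], [0.0, 0.3]])

rng = np.random.default_rng(1)
T = 200
e = rng.normal(size=(T, 2))      # true structural shocks
y = np.zeros((T, 2))
for t in range(1, T):
    # Simulate y_t = A0^{-1} (A1 y_{t-1} + e_t).
    y[t] = np.linalg.solve(A0, A1 @ y[t - 1] + e[t])

# Reconstruct the shocks from the data:  e_t = A0 y_t - A1 y_{t-1}.
e_hat = (y[1:] @ A0.T) - (y[:-1] @ A1.T)
print(np.max(np.abs(e_hat - e[1:])))  # ~0: shocks recovered exactly
```

In practice A(L) is not known and must be estimated, typically under identifying restrictions on A0.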

Source: econterms

symmetric

A matrix M is symmetric if for every row i and column j, element M[i,j] = M[j,i].
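The definition translates directly into a check (a minimal sketch with a matrix given as a list of rows):

```python
def is_symmetric(M):
    # True if M[i][j] == M[j][i] for every row i and column j.
    n = len(M)
    return all(M[i][j] == M[j][i] for i in range(n) for j in range(n))

print(is_symmetric([[1, 2], [2, 3]]))  # True
print(is_symmetric([[1, 2], [0, 3]]))  # False
```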

Source: econterms

Synonyms (of hindsight bias)

Rückschau-Fehler (German for 'hindsight error'), knew-it-all-along effect, creeping determinism

Source: SFB 504

Copyright © 2006 Experimental Economics Center. All rights reserved. Send us feedback