"The control exerted by a discriminative stimulus is traditionally dealt with under the heading of attention. This concept reverses the direction of action by suggesting, not that a stimulus controls the behavior of an observer, but that the observer attends to the stimulus and thereby controls it. Nevertheless, we sometimes recognize that the object 'catches or holds the attention' of the observer."-- Skinner, B. F. (1953). Science and Human Behavior, 122.
"But attention is more than looking at something or looking at a class of things in succession...Attention is a controlling relation -- the relation between a response and a discriminative stimulus."-- Skinner, B. F. (1953). Science and Human Behavior, 123.
"We take it as axiomatic that behavior is a continuous stream. As Skinner noted, the stream may be divided for analytic purposes into reflex units the actual sizes of which are determined by the lawfulness they exhibit....The continuousness of behavior means that the organism can be thought of as 'always doing something,'..."-- Schoenfeld, W. N., & Farmer, J. (1970). The Theory of Reinforcement Schedules, 222.
"What type of behavior is it that we call 'conceptual'? And the answer is that when a group of objects get the same response, when they form a class the members of which are reacted to similarly, we speak of a concept....Classes of objects or events, differently responded to, develop different concepts....Generalization within classes and discrimination between classes-- this is the essence of concepts."-- Keller, F. S., & Schoenfeld, W. N. (1950). Principles of Psychology, 154.
"Any stimulus present when an operant is reinforced acquires control in the sense that the rate will be higher when it is present. Such a stimulus does not act as a goad; it does not elicit the response in the sense of forcing it to occur. It is simply an essential aspect of the occasion upon which a response is made and reinforced. The difference is made clear by calling it a discriminative stimulus..." -- Skinner, B. F. (1969). Contingencies of Reinforcement: A Theoretical Analysis, 7.
"We use operant discrimination in two ways. In the first place, stimuli which have already become discriminative are manipulated in order to change probabilities....In the second place, we may set up a discrimination in order to make sure that a future stimulus will have a given effect when it appears." -- Skinner, B. F. (1953). Science and Human Behavior, 109.
"The eliciting stimulus was defined...as 'a part, or modification of a part, of the environment' correlated with the occurrence of a response."-- Skinner, B. F. (1938). Behavior of Organisms, 234.
"...the emotional stimulus affects the proportionality of reserve and rate. Facilitating and inhibitory stimuli may be included in this class."-- Skinner, B. F. (1938). Behavior of Organisms, 242.
"The positive reinforcement procedure conventionally is described as presenting a stimulus following a response, as in delivering a food pellet, with an emphasis on the addition to the environment. The negative reinforcement procedure is described as subtraction (removal) from the environment, as in removing electric shock after a response occurs...the distinction between presentation and removal may be arbitrary. For example, presenting food is equivalent to removing the stimuli associated with nonfood delivery and electric shock removal is indistinguishable from presenting stimuli correlated with a 'safe' period."-- Lattal, K. A. (1991). Experimental Analysis of Behavior: Part 1, 88.
"...the strengthening of behavior which results from reinforcement is appropriately called 'conditioning'. In operant conditioning we 'strengthen' an operant in the sense of making a response more probable or, in actual fact, more frequent."-- Skinner, B. F. (1953). Science and Human Behavior, 65.
"The class of responses upon which a reinforcer is contingent is called an operant, to suggest the action on the environment followed by reinforcement. We construct an operant by making a reinforcer contingent on a response, but the important fact about the resulting unit is not its topography but its probability of occurrence, observed as rate of emission."-- Skinner, B. F. (1969). Contingencies of Reinforcement: A Theoretical Analysis, 7.
"To be observed, a response must affect the environment -- it must have an effect upon an observer or upon an instrument which in turn can affect an observer."-- Skinner, B. F. (1969). Contingencies of Reinforcement: A Theoretical Analysis, 130.
"An operant is a class, of which a response is an instance or member."-- Skinner, B. F. (1969). Contingencies of Reinforcement: A Theoretical Analysis, 131.
"In Pavlovian or 'respondent' conditioning we simply increase the magnitude of the response elicited by the conditioned stimulus and shorten the time which elapses between stimulus and response."-- Skinner, B. F. (1953). Science and Human Behavior, 65.
"...some events which follow responses have the effect of increasing the likelihood that the response will be repeated. Such events are defined as reinforcers, not in terms of any effect they might have upon the internal mechanisms of the organism, but strictly in terms of the effect they have in increasing the probability of a response."-- Wilcoxon, H.C. (1969). Reinforcement and Behavior, 30.
"The only way to tell whether a given event is reinforcing to a given organism under given conditions is to make a direct test. We observe the frequency of a selected response, then make an event contingent upon it and observe any change in frequency. If there is a change, we classify the event as reinforcing to the organism under the existing conditions."-- Skinner, B. F. (1953). Science and Human Behavior, 72-73.
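Skinner's "direct test" above is a procedure: measure baseline response frequency, make the event contingent on the response, and classify the event by any change in frequency. A minimal sketch of that logic, assuming a toy model in which a genuine reinforcer raises response probability by a fixed increment (the function names, probabilities, and increment are illustrative, not from the sources):

```python
import random

def run_phase(p, trials, contingent, boost=0.02, rng=None):
    """Count responses over `trials`. If `contingent`, the event follows
    each response and (in this toy model) strengthens responding by
    raising the response probability `p` a small step per delivery."""
    rng = rng or random.Random(0)
    count = 0
    for _ in range(trials):
        if rng.random() < p:
            count += 1
            if contingent:
                p = min(1.0, p + boost)
    return count

def direct_test(p0=0.2, trials=500, seed=1):
    """Skinner's test: baseline frequency, then frequency with the
    event contingent on the response; reinforcing iff frequency rises."""
    rng = random.Random(seed)
    baseline = run_phase(p0, trials, contingent=False, rng=rng)
    test = run_phase(p0, trials, contingent=True, rng=rng)
    return baseline, test, test > baseline
```

The classification is relational, as the quote stresses: nothing about the event itself is inspected, only the change in frequency it produces under the existing conditions.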
"...The first cornerstone problem in any treatment of reinforcement schedules is the definition of the response. By specifying those measurable properties (duration, energy, etc.) which in a given experiment will be accepted as qualifying a 'response occurrence' for reinforcement, and thereby specifying the boundaries of the class defining the 'response' ( R ), the experimenter at once bestows a character and destiny upon his experiment which are critical for both its interpretation and its practical utility. Although setting the boundaries of a response class always involves arbitrary decisions, an experimenter cannot take a know-nothing attitude towards the consequences of that placement. In point of fact, those consequences determine how we will understand the experimental findings, and may completely remove the significance of the experiment from what was first intended. The other face of the problem, of course, is that the definition of the R-class fixes also the definition of the class of responses not to be reinforced, the 'not-R'.... Because behavior is an unbroken stream, there is in every experiment an embedding...[not-R]...context for R, and the outcome of every experiment depends as much upon what responses are not reinforced as upon those that are."-- Schoenfeld, W. N., & Farmer, J. (1970). The Theory of Reinforcement Schedules, 218-219.
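Schoenfeld and Farmer's point is that defining R is defining a boundary in measurement space, and everything outside it becomes not-R by the same act. As a sketch only (the property names and thresholds are hypothetical, chosen to mirror the quote's examples of duration and energy):

```python
def make_response_class(min_duration_s, min_force_g):
    """Return a predicate defining the operant class R.
    Any emission failing the criteria falls into not-R,
    the embedding context the experimenters describe."""
    def is_R(duration_s, force_g):
        return duration_s >= min_duration_s and force_g >= min_force_g
    return is_R

# The experimenter's arbitrary-but-consequential decision:
is_R = make_response_class(min_duration_s=0.05, min_force_g=10.0)
```

Moving either threshold redraws both R and not-R at once, which is exactly why the quote insists the placement cannot be treated as inconsequential.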
"When small amounts of food are repeatedly given, a 'superstitious ritual' may be set up. This is due not only to the fact that a reinforcing stimulus strengthens any behavior it may happen to follow, even though a contingency has not been explicitly arranged, but also to the fact that the change in behavior resulting from one accidental contingency makes similar accidents more probable."-- Skinner, B. F. (1959). Cumulative Record, 409.
"The experiment might be said to demonstrate a sort of superstition. The bird behaves as if there were a causal relation between its behavior and the presentation of food, although such a relation is lacking. There are many analogies in human behavior. Rituals for changing one's luck at cards are good examples. A few accidental connections between a ritual and favorable consequences suffice to set up and maintain the behavior in spite of many nonreinforced instances. The bowler who has released a ball down the alley but continues to behave as if he were controlling it by twisting and turning his arm and shoulder is another case in point. These behaviors have, of course, no real effect upon one's luck or upon a ball halfway down an alley."-- Skinner, B. F. (1948). "Superstition" in the pigeon. Journal of Experimental Psychology, 38, 168-172.
"Rituals are superstitions; they are adventitiously reinforced. The more conspicuous and stereotyped the behavior upon which the reinforcer is accidentally contingent, the greater the effect."-- Skinner, B. F. (1980), Notebooks, 303-304.
"It does not follow that every conditioned reflex has survival value. The mechanism may go wrong. Certain pairs of stimuli, such as the appearance and taste of food, may occur together in a consistent way which is important to the organism throughout its life, but we have no guarantee that conditioning will not occur when the pairing of stimuli is temporary or accidental. Many 'superstitions' exemplify conditioned responses arising from accidental contingencies. The behavior is due to an actual pairing of stimuli, but the resulting reflex is not useful."-- Skinner, B. F. (1953). Science and Human Behavior, 55.
"The environment is so constructed that certain things tend to happen together. The organism is so constructed that its behavior changes when it comes into contact with such an environment. There are three principal cases. (1) Certain events -- like the color and taste of ripe fruit -- tend to occur together. Respondent conditioning is the corresponding effect upon behavior. (2) Certain activities of the organism effect certain changes in the environment. Operant conditioning is the corresponding effect upon behavior. (3) Certain events are the occasions upon which certain actions effect certain changes in the environment. Operant discrimination is the corresponding effect upon behavior. As a result of these processes, the organism which finds itself in a novel environment eventually comes to behave in an efficient way."-- Skinner, B. F. (1953). Science and Human Behavior, 125.
"Temporal relation of S⁰ and S¹. The required relation of the stimuli is expressed by Pavlov as follows: 'The fundamental requisite is that any external stimulus which is to become the signal in a conditioned reflex must overlap in point of time with the action of an unconditioned stimulus....It is equally necessary that the conditioned stimulus should begin to operate before the unconditioned comes into action' [(64), pp. 26, 27]."-- Skinner, B. F. (1938) Behavior of Organisms, 64.
"We describe the contingency by saying that a stimulus (the light) is the occasion upon which a response (stretching the neck) is followed by reinforcement (with food). We must specify all three terms."--Skinner, B. F. (1953). Science and Human Behavior, 108.
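The three-term contingency Skinner specifies can be sketched as a simple record with one field per term; the field names and example strings are mine, not the sources':

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ThreeTermContingency:
    """Skinner's three terms: occasion, response, consequence."""
    discriminative_stimulus: str  # the occasion (e.g., the light)
    response: str                 # the operant class (e.g., neck-stretching)
    reinforcer: str               # the consequence (e.g., food)

    def describe(self):
        return (f"In the presence of {self.discriminative_stimulus}, "
                f"{self.response} is followed by {self.reinforcer}.")
```

All three fields are required to construct an instance, which mirrors the quote's demand that a contingency is not specified until all three terms are.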