2013-10-09
Many of the commonly used methods for modeling and fitting psychophysical data are special cases of statistical procedures of great power and generality, notably the Generalized Linear Model (GLM). This book illustrates how to fit data from a variety of psychophysical paradigms using modern statistical methods and the statistical language R. The paradigms include signal detection theory, psychometric function fitting, classification images, and more. Two chapters examine recently developed methods for scaling appearance: maximum likelihood difference scaling and maximum likelihood conjoint measurement. The authors also consider the application of mixed-effects models to psychophysical data.
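
To give a flavor of the approach (this sketch is not taken from the book; the intensities and response counts below are made up for illustration), a yes/no psychometric function can be fit as a binomial GLM with a probit link in a few lines of R:

# Hypothetical detection data: "seen" responses out of 30 trials
# at each of six stimulus intensities.
intensity <- c(0.5, 1, 2, 4, 8, 16)
n_seen    <- c(2, 5, 12, 20, 27, 30)
n_trials  <- rep(30, length(intensity))

# The psychometric function as a GLM: binomial family, probit link,
# log intensity as the predictor.
fit <- glm(cbind(n_seen, n_trials - n_seen) ~ log(intensity),
           family = binomial(link = "probit"))
summary(fit)

# The 50% point is the intensity at which the linear predictor equals 0.
exp(-coef(fit)[1] / coef(fit)[2])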

R is an open-source programming language that is widely used by statisticians and is seeing enormous growth in its application to data in all fields. It is interactive and contains many powerful facilities for optimization, model evaluation, model selection, and graphical display of data. The reader who fits data in R can readily make use of these methods, and the researcher who uses R to fit and model data has access to the most recently developed statistical methods.
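
As a small illustration of those facilities (again a sketch with simulated data, not an excerpt from the book), the same kind of model can be fit by direct maximization of the likelihood with optim() and compared with a glm() fit via AIC:

set.seed(1)
x <- rep(1:5, each = 20)                    # simulated stimulus levels
y <- rbinom(length(x), 1, pnorm(x, 3, 1))   # simulated binary responses

# Negative log-likelihood of a cumulative-Gaussian psychometric function;
# the scale parameter is optimized on the log scale to keep it positive.
negloglik <- function(p) {
  prob <- pnorm(x, mean = p[1], sd = exp(p[2]))
  -sum(dbinom(y, 1, prob, log = TRUE))
}
fit_ml  <- optim(c(2, 0), negloglik)        # direct likelihood maximization
fit_glm <- glm(y ~ x, family = binomial("probit"))

2 * 2 + 2 * fit_ml$value                    # AIC of the optim() fit
AIC(fit_glm)                                # AIC of the glm() fit (should agree closely)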

This book does not assume that the reader is familiar with R; a little experience with any programming language is all that is needed to appreciate it. There are many examples of R code in the text, and the source code for all examples is available in an R package, MPDiR, available through R.

Kenneth Knoblauch is a researcher in the Department of Integrative Neurosciences, Inserm Unit 846, The Stem Cell and Brain Research Institute, and is associated with the University Claude Bernard, Lyon 1, in France.

Laurence T. Maloney is Professor of Psychology and Neural Science at New York University. His research focuses on applications of mathematical models to perception, motor control, and decision making.

PDF download (visible after replying)

Hidden content in this post:

Modeling Psychophysical Data in R.pdf
Size: 5.18 MB

Costs only 1 forum coin. Download now.





All replies
2013-10-9 09:45:22
Table of contents:
1 A First Tour Through R by Example ... 1
1.1 Getting Started ... 1
1.2 The Experiment of Hecht, Shlaer and Pirenne ... 2
1.2.1 Accessing the Data Set ... 2
1.2.2 Viewing the Data Set ... 5
1.2.3 One More Plot Before Modeling ... 7
1.3 Modeling the Data ... 8
1.3.1 Modeling a Subset of the Data ... 10
1.3.2 Estimating Thresholds and jnd's ... 15
1.3.3 Modeling All of the Data ... 16
1.3.4 Odds and Ends ... 19
1.4 Exercises ... 20
2 Modeling in R ... 21
2.1 Introduction ... 21
2.2 The Linear Model ... 21
2.2.1 Development of First- and Second-Order Motion ... 26
2.3 Linear Mixed-Effects Models ... 32
2.3.1 The ModelFest Data Set ... 33
2.4 Nonlinear Models ... 44
2.4.1 Chromatic Sensitivity Across the Life Span ... 45
2.5 Additive Models ... 49
2.5.1 Chromatic Sensitivity, Again ... 50
2.6 Generalized Linear Models ... 53
2.6.1 Chromatic Sensitivity, Yet Again ... 54
2.7 Which Model to Choose? ... 56
2.8 Exercises ... 57
3 Signal Detection Theory ... 61
3.1 Introduction ... 61
3.2 Sensory and Decision Spaces ... 61
3.2.1 Optimal Detection ... 63
3.3 Case 1: Equal-Variance Gaussian SDT ... 64
3.3.1 The Decision Rule for Equal-Variance Gaussian SDT ... 66
3.3.2 Simulating and Fitting Equal-Variance Gaussian SDT ... 66
3.3.3 Maximum Likelihood Estimation of Equal-Variance Gaussian SDT Parameters ... 68
3.3.4 Fisher's Tea Test ... 69
3.3.5 Fitting Equal-Variance SDT as a GLM ... 72
3.3.6 Fitting with Multiple Conditions ... 74
3.3.7 Complete Separation ... 77
3.3.8 The Receiver Operating Characteristic ... 79
3.4 Case 2: Unequal-Variance Gaussian SDT ... 81
3.4.1 The Decision Rule for Unequal-Variance Gaussian SDT ... 81
3.4.2 Simulating Unequal-Variance Gaussian SDT ... 82
3.4.3 Fitting Unequal-Variance Gaussian SDT by Direct Optimization ... 85
3.5 Case 3: Exponential SDT ... 88
3.6 m-Alternative Forced-Choice and Other Paradigms ... 89
3.6.1 Letter Identification: 4AFC ... 90
3.6.2 d' for Other Paradigms ... 91
3.7 Rating Scales ... 91
3.7.1 Ordinal Regression Models on Cumulative Probabilities ... 94
3.7.2 Sensitivity to Family Resemblance ... 99
4 The Psychometric Function: Introduction ... 107
4.1 Brief Historical Review ... 107
4.2 Fitting Psychometric Functions to Data ... 113
4.2.1 Direct Maximization of Likelihood ... 114
4.3 A Model Psychophysical Observer ... 120
4.4 Fitting Psychometric Functions with glm ... 121
4.4.1 Alignment Judgments of Drifting Grating Pairs ... 123
4.4.2 Selecting Models and Analysis of Deviance ... 126
4.5 Diagnostic Plots ... 129
4.6 Complete Separation ... 135
4.7 The Fitted Model ... 136
4.8 Summary ... 138
5 The Psychometric Function: Continuation ... 141
5.1 Fitting Multiple-Alternative Forced Choice Data Sets ... 141
5.1.1 Using an Offset to Set the Lower Asymptote ... 142
5.1.2 Using Specialized Link Functions to Set the Lower Asymptote ... 145
5.1.3 Estimating Upper and Lower Asymptotes ... 147
5.2 Staircase Methods ... 151
5.3 Extracting Fitted Parameters ... 153
5.4 Methods for Standard Errors and Confidence Intervals ... 155
5.5 Bootstrap Standard Error Estimates ... 157
5.6 Nonparametric Approaches ... 163
6 Classification Images ... 167
6.1 Ahumada's Experiment ... 167
6.1.1 Simulating a Classification Image Experiment ... 169
6.1.2 The Information Used in Offset Detection ... 172
6.2 Some History ... 174
6.3 The Classification Image as a Linear Model ... 175
6.3.1 Detection of a Gabor Temporal Modulation ... 175
6.3.2 Estimation by a Linear Model ... 177
6.3.3 Estimation with lm ... 177
6.3.4 Model Evaluation ... 178
6.4 Classification Image Estimation with glm ... 179
6.5 Fitting a Smooth Model ... 183
6.5.1 Higher-Order Classification Images ... 187
6.6 Comparing Methods ... 188
7 Maximum Likelihood Difference Scaling ... 195
7.1 Introduction ... 195
7.1.1 Choosing the Quadruples ... 199
7.1.2 Data Format ... 200
7.1.3 Fitting the Model: Direct Maximization ... 201
7.1.4 GLM Method ... 203
7.1.5 Representation as a Ratio Scale ... 205
7.2 Perception of Correlation in Scatterplots ... 205
7.2.1 Estimating a Difference Scale ... 207
7.2.2 Fitting a Parametric Difference Scale ... 209
7.2.3 Visualizing the Decision Space for Difference Judgments ... 211
7.3 The Method of Triads ... 214
7.4 The Effect of Number of Trials ... 215
7.5 Bootstrap Standard Errors ... 217
7.6 Robustness and Diagnostic Tests ... 219
7.6.1 Goodness of Fit ... 219
7.6.2 Diagnostic Test of the Measurement Model ... 223
8 Maximum Likelihood Conjoint Measurement ... 229
8.1 Introduction ... 229
8.2 The Model ... 233
8.3 The Nonparametric Model ... 236
8.3.1 Fitting the Nonparametric Model Directly by Maximum Likelihood ... 240
8.3.2 Fitting the Nonparametric Model with glm ... 241
8.4 Fitting Parametric Models ... 249
8.5 Resampling Methods ... 252
8.5.1 The Choice of Stimuli ... 252
8.6 The Axiomatic Model ... 254
9 Mixed-Effects Models ... 257
9.1 Simple Detection ... 259
9.1.1 Random Criterion Effects ... 259
9.1.2 Random Sensitivity Effects ... 263
9.2 Rating Scale Data ... 264
9.3 Psychometric Functions: Random Intercepts and Slopes ... 267
9.3.1 Context Effects on Detection ... 276
9.3.2 Multiple-Alternative Forced-Choice Data ... 286
9.4 MLDS ... 287
9.4.1 Regression on the Decision Variable: Fixed Functional Form ... 287
9.4.2 Regression on the Decision Variable: Rescaling to a Common Scale ... 290
9.4.3 Regression on the Estimated Coefficients ... 295
9.5 Whither from Here ... 301
A Some Basics of R ... 303
A.1 Installing R ... 303
A.2 Basic Data Types ... 304
A.2.1 Vectors ... 304
A.2.2 Factors ... 309
A.2.3 Lists ... 310
A.2.4 Data Frames ... 311
A.2.5 Other Types of Objects ... 313
A.3 Simple Programming Constructs ... 313
A.4 Interacting with the Operating System ... 313
A.5 Getting Data into and Out of R ... 314
A.5.1 Writing and Reading Text Files ... 315
A.5.2 write.table ... 315
A.5.3 read.table ... 315
A.5.4 scan ... 317
A.5.5 save and load ... 317
A.5.6 Interacting with Matlab ... 318
A.5.7 Interacting with Excel Files ... 318
A.5.8 Reading in Other Formats ... 318
B Statistical Background ... 321
B.1 Introduction ... 321
B.2 Random Variables and Monte Carlo Methods ... 323
B.2.1 Finite, Discrete Random Variables ... 323
B.2.2 Bernoulli Random Variables ... 324
B.2.3 Continuous Random Variables ... 324
B.2.4 Discrete vs Continuous Random Variables ... 325
B.2.5 Normal (Gaussian) Random Variables ... 326
B.2.6 Location-Scale Families ... 327
B.3 Independence and Samples ... 328
B.3.1 Expectation and Moments ... 329
B.3.2 Bernoulli and Binomial Random Variables ... 331
B.3.3 Uniform Random Variables ... 331
B.3.4 Exponential and Gamma Random Variables ... 333
B.3.5 The Exponential Family ... 333
B.3.6 Monte Carlo Methods ... 335
B.4 Estimation ... 335
B.4.1 What Makes an Estimate "Good"? ... 336
B.4.2 Maximum Likelihood Estimation ... 337
B.4.3 Why Likelihood? Some Properties ... 340
B.4.4 Bayesian Estimation ... 342
B.5 Nested Hypothesis Testing ... 347
B.5.1 Reparameterization: Nested Hypothesis Tests ... 349
B.5.2 The Akaike Information Criterion ... 349
References ... 351
Index ... 361

2013-10-9 09:49:19

2013-10-9 09:58:47
THANKS!

2013-10-9 11:55:08
thanks

2013-10-9 13:25:53
Downloading to take a look.
