# Problem 1. Read the article cotter1999forward.pdf found in the homework subfolder

## Question

Problem 1. Read the article cotter1999forward.pdf found in the homework subfolder. This paper describes three variants of forward stepwise selection.
a. Implement these three variants as functions bmp(x, y), mmp(x, y), and ormp(x, y).
b. Consider the AmesHousing dataset. To use it, install the package of the same name, then create a clean version of the dataset, named ames, with the command ames = make_ames(). The response is Sale_Price. Fit a simple linear model explaining the response as a function of the other variables, selecting the variables with each of the three variants of forward selection that you implemented.
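The three variants in Cotter et al. (1999) are matching-pursuit algorithms: basic (BMP), modified (MMP, also known as orthogonal matching pursuit), and order-recursive (ORMP). A minimal sketch of the three, assuming `x` is a numeric matrix and `y` a numeric vector; the argument `k` and the fixed step count are assumptions here, since the paper stops on a residual-energy criterion instead:

```r
bmp <- function(x, y, k = ncol(x)) {
  # Basic Matching Pursuit: at each step pick the column most correlated
  # with the residual and subtract its one-dimensional projection.
  r <- y; sel <- integer(0)
  for (i in seq_len(k)) {
    score <- abs(crossprod(x, r)) / sqrt(colSums(x^2))
    j <- which.max(score)
    sel <- c(sel, j)
    xj <- x[, j]
    r <- r - xj * sum(xj * r) / sum(xj^2)   # remove projection on x_j
  }
  unique(sel)   # BMP may revisit a column; report each one once
}

mmp <- function(x, y, k = min(ncol(x), nrow(x))) {
  # Modified Matching Pursuit: after each selection, refit least squares
  # on all selected columns, so the residual is orthogonal to the
  # selected set and no column is picked twice.
  r <- y; sel <- integer(0)
  for (i in seq_len(k)) {
    score <- abs(crossprod(x, r)) / sqrt(colSums(x^2))
    score[sel] <- -Inf
    j <- which.max(score)
    sel <- c(sel, j)
    fit <- lm.fit(x[, sel, drop = FALSE], y)
    r <- drop(y - x[, sel, drop = FALSE] %*% fit$coefficients)
  }
  sel
}

ormp <- function(x, y, k = min(ncol(x), nrow(x))) {
  # Order-Recursive Matching Pursuit: candidates are scored after being
  # orthogonalized against the already-selected columns.
  xo <- x; r <- y; sel <- integer(0)
  for (i in seq_len(k)) {
    nrm <- sqrt(colSums(xo^2)); nrm[nrm < 1e-10] <- Inf
    score <- abs(crossprod(xo, r)) / nrm
    score[sel] <- -Inf
    j <- which.max(score)
    sel <- c(sel, j)
    q <- xo[, j] / sqrt(sum(xo[, j]^2))      # new orthonormal direction
    r <- r - q * sum(q * r)
    xo <- xo - outer(q, as.vector(crossprod(q, xo)))  # deflate the rest
  }
  sel
}
```

For the Ames data, each function would be called on the model matrix of the predictors (e.g. `model.matrix(Sale_Price ~ ., ames)[, -1]`) with `y = ames$Sale_Price`.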

Problem 2. Read Section 3.8 of the textbook Elements of Statistical Learning, 2nd edition.
a. Implement Algorithm 3.4 as a function incremental.stagewise(x, y, eps).
b. Apply that function to the AmesHousing dataset.
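Algorithm 3.4 (incremental forward stagewise) starts from a zero coefficient vector and repeatedly nudges, by a small amount eps, the coefficient of the predictor most correlated with the current residual. A sketch under the assumption that predictors are standardized inside the function and that `n_steps` (an added parameter, not in the textbook) caps the iterations:

```r
incremental.stagewise <- function(x, y, eps = 0.01, n_steps = 2000) {
  # ESL Algorithm 3.4: beta starts at 0; each step finds the predictor
  # most correlated with the residual and moves its coefficient by eps
  # in the direction of that correlation.
  x <- scale(x)                 # standardize predictors
  r <- y - mean(y)              # work with the centered response
  beta <- numeric(ncol(x))
  for (s in seq_len(n_steps)) {
    cors <- drop(crossprod(x, r))
    j <- which.max(abs(cors))
    if (abs(cors[j]) < 1e-8) break   # no remaining correlation
    delta <- eps * sign(cors[j])
    beta[j] <- beta[j] + delta
    r <- r - delta * x[, j]
  }
  beta                          # coefficients on the standardized scale
}
```

Note that the returned coefficients refer to the standardized predictors; for the Ames application they would need to be mapped back to the original scale if raw-scale coefficients are wanted.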

Problem 3. Consider the following study of diabetes among Pima Indians. The response is test, and we want to build a logistic model predicting it from a linear combination of the other variables.
a. Apply forward selection with AIC, meaning that the model stops growing the moment no variable makes the AIC decrease.
b. Apply best subset selection with AIC. (You may use the package bestglm.)
c. Apply l1-penalized logistic regression, which is for example implemented in the glmnet package. Choose the tuning parameter via cross-validation based on misclassification error. (In all cases, make sure your code returns the selected model.)
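One way the three fits might be set up, sketched on a simulated stand-in for the Pima data so the block is self-contained (the predictor names below are hypothetical; in practice, substitute the real data frame, whose 0/1 response column is test):

```r
library(bestglm)   # best subset selection for GLMs
library(glmnet)    # l1-penalized logistic regression

# Hypothetical stand-in for the Pima data; replace with the real data
# frame, which has a 0/1 response column `test`.
set.seed(42)
n <- 200
pima <- data.frame(glucose = rnorm(n), bmi = rnorm(n),
                   age = rnorm(n), pedigree = rnorm(n))
pima$test <- rbinom(n, 1, plogis(1.2 * pima$glucose + 0.8 * pima$bmi - 0.5))

# (a) Forward selection by AIC: step() keeps adding variables only
#     while the AIC decreases.
null_fit <- glm(test ~ 1, family = binomial, data = pima)
full_fit <- glm(test ~ ., family = binomial, data = pima)
fwd <- step(null_fit, scope = formula(full_fit),
            direction = "forward", trace = 0)

# (b) Best subset selection by AIC: bestglm wants one data frame with
#     the predictors first and the response last.
Xy <- cbind(pima[setdiff(names(pima), "test")], y = pima$test)
bss <- bestglm(Xy, family = binomial, IC = "AIC")

# (c) l1-penalized logistic regression; lambda chosen by CV on
#     misclassification error ("class").
X <- as.matrix(pima[setdiff(names(pima), "test")])
cvfit <- cv.glmnet(X, pima$test, family = "binomial",
                   type.measure = "class")
coef(cvfit, s = "lambda.min")   # nonzero rows = selected variables
```

Each part then returns its selected model: `fwd` for (a), `bss$BestModel` for (b), and the nonzero coefficients of `cvfit` at `lambda.min` for (c).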

## Solution Preview


1.

```r
library(quantmod)
library(quantreg)

# Local median fit: quantile regression at tau = 0.5 with a
# degree-h polynomial in x, evaluated at x_new.
localLMS <- function(x, y, x_new, h) {
  fit <- rq(y ~ poly(x, h), tau = 0.5)
  predict(fit, newdata = data.frame(x = x_new))
}

# Taking a small subset of the data to show lines clearly.
dat <- dat[1000:1100, ]
x <- dat$Open
y <- dat$Close
x_new <- dat$Open

fitted.1 <- localLMS(x, y, x_new...
```
