The following document is meant as an outline of what is covered in this assignment.
 
 
== Review For Lab ==
 
Note: Going forward, I will refer to the Sakai / Resources Folder / ChapraPythonified / Chapter 14 folder as "CP14"
 
* Read the "Part Four" introduction in Chapra, starting on p. 359 in the text and 102/539 on the web page.  In section 4.2, note that we have talked / will talk about Chapters 14-18 but we will skip Chapter 16.
 
* Read the introduction to Chapter 14.
 
* Read 14.1
 
** The Python version of the code in 14.1.3 is in CP14 and is named Chapra_14_1_3.py.  You will need the Table14p03.txt data file for it to work.  Notice in particular that NumPy's variance and standard deviation functions require a keyword argument to match MATLAB's defaults (see the short sketch below).
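As a quick illustration of that keyword argument: MATLAB's <code>var</code> and <code>std</code> divide by ''n''-1 by default, while NumPy divides by ''n'' unless you pass <code>ddof=1</code>.  The data values below are made up just for the example, not taken from the table file.

 import numpy as np
 
 y = np.array([6.395, 6.435, 6.485, 6.495])   # made-up sample data
 
 # Default NumPy behavior: population statistics (divide by n)
 print(np.var(y), np.std(y))
 
 # With ddof=1: sample statistics (divide by n-1), matching MATLAB's var and std defaults
 print(np.var(y, ddof=1), np.std(y, ddof=1))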
 
* We have covered the Python versions of 14.2.  Examples 14.2 and 14.3 are in CP14 and are named Chapra_ex_14_2.py and Chapra_ex_14_3.py.
 
* Skim 14.3.  We did the mathematical proof in class.  See [[General_Linear_Regression]] for specific and general versions but do not get too hung up on the math.
 
* Skip or skim 14.4.  Linearization is an important concept but outside of the scope of this lab.  Figure 14.13 would be a good one to look at though as it shows how plotting transformed variables in different ways might lead to an understanding of the underlying mathematical relationship.
 
* Skip 14.5.1.  The <code>linregr</code> function there is a way-too-specific version of what we will eventually use.
 
* Read 14.5.2.  Note that the main differences between Python and MATLAB here are that we need to set up arrays differently (MATLAB can just use bracketed numbers separated by spaces, while Python needs <code>np.array([LIST])</code>) and that Python needs the <code>np.</code> prefix in front of certain commands.  A short translation sketch follows below.
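For instance, here is a hedged sketch of how a MATLAB-style array setup might translate to Python; the variable names and values are made up for the example, not taken from 14.5.2.

 import numpy as np
 
 # MATLAB: x = [10 20 30 40];  y = [25 70 380 550];
 x = np.array([10, 20, 30, 40])
 y = np.array([25, 70, 380, 550])
 
 # MATLAB: n = length(x);  sx = sum(x);
 n = len(x)        # or x.size
 sx = np.sum(x)    # the "np." prefix goes in front of commands that come from NumPy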
 
* Skip or skim the case study.
 
* Go to [[Python:Fitting]] and specifically:
 
** Read the intro and note that, to run the demos, you will need a Cantilever.dat file with the contents shown on that page.
 
** Take a look at the common command references
 
** Look through the common code; for this lab there will be a special version of it called <code>lab9_common.py</code>, which just includes the <code>calc_stats()</code> function (a hedged sketch of what such a function might compute appears after this list).
 
** Take a look at the Polynomial Fitting code and make sure you completely understand each part of it.  The parts with a white background are going to be common to all the demonstrations while the code with a yellow background will be what changes each time (a brief <code>np.polyfit()</code> sketch also appears after this list).
 
** Take a look at how to make changes to Polynomial models ([[Python:Fitting#Polynomial]]) and General Linear models ([[Python:Fitting#General_Linear]]).
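The posted <code>lab9_common.py</code> on Sakai is the authoritative version of the common code; as an assumption about what a <code>calc_stats()</code> helper typically does for these fits, a minimal sketch might compute $$S_t$$, $$S_r$$, and the coefficient of determination (the signature here is hypothetical):

 import numpy as np
 
 def calc_stats(y, yhat):
     # Hypothetical signature: y is the measured data, yhat is the model estimate
     st = np.sum((y - np.mean(y))**2)   # S_t: spread of the data about its mean
     sr = np.sum((y - yhat)**2)         # S_r: spread of the data about the fit
     r2 = (st - sr) / st                # coefficient of determination
     return st, sr, r2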
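The core of the polynomial fitting demonstration uses NumPy's <code>np.polyfit()</code> and <code>np.polyval()</code>; here is a minimal sketch of those two calls (the data values are made up, not the Cantilever data):

 import numpy as np
 
 x = np.array([0.0, 1.0, 2.0, 3.0])        # made-up independent data
 y = np.array([1.1, 2.9, 5.2, 6.8])        # made-up dependent data
 
 p = np.polyfit(x, y, 1)      # coefficients of a 1st-order (straight-line) fit
 yhat = np.polyval(p, x)      # model estimates at the original x values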
 
* '''You can now do Chapra 14.5 completely.'''
 
* Skim 15.1. 
 
* Take a look at the General Linear Regression code at [[Python:Fitting]] and make sure you understand what each line does.  The <code>np.block()</code> command is the key element in producing the $$\Phi$$ matrix needed by <code>np.linalg.lstsq()</code>, the least squares solver in NumPy's linear algebra package.  A brief sketch follows below.
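As a hedged sketch of the general idea (not the exact Pundit code), each column of $$\Phi$$ holds one basis function evaluated at the data points, and <code>np.linalg.lstsq()</code> solves for the coefficients; the data values here are made up for the example.

 import numpy as np
 
 x = np.array([1.0, 2.0, 3.0, 4.0])         # made-up independent data
 y = np.array([3.2, 5.1, 7.3, 8.8])         # made-up dependent data
 
 xc = x.reshape(-1, 1)                      # column-vector version of x
 phi = np.block([[xc, xc**0]])              # basis functions x and 1 as columns of Phi
 coefs = np.linalg.lstsq(phi, y, rcond=None)[0]   # least squares coefficients
 yhat = phi @ coefs                         # model estimates at the data points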
 
* '''You can now do Chapra 15.10, 15.10 alt, and 15.12 completely.'''
 
<!--
 
* Read 15.2. The key is to do fitting with multiple independent variables, those variables need to be column vectors.  To do graphs, those variables need to be 2D arrays.  See the 2D example in [[Python:Fitting]].
 
* Skim 15.3.  We did this in class; once again, see [[General_Linear_Regression]].
 
* I am working on Pythonifying Chapter 15 - stay tuned!
 
* Take a look at the General Linear Regression code and make sure you understand what each line does.  The np.block() command is the key element in producing the $$\Phi$$ matrix needed for the least squares fit method of the linear algebra package in numpy.
 
* '''You can now do Chapra  15.10, 15.10 alt, 15.6, and 15.7 completely.'''
 
-->
 
<!--
 
14.7, 14.27, 15.5
 
* Skip 15.4. 
 
* Skim 15.5.  Note that it amplifies that the goal is to minimize $$S_r$$.  However, skip the MATLAB code entirely and instead:
 
* Go to [[Python:Fitting]] and specifically:
 
** Read the section on [[Python:Fitting#Nonlinear_Regression]]. Pay very careful attention to the section discussing [[Python:Fitting#Defining_The_Function]] and note that there is a discussion of args and kwargs below.
 
** Take a look at how to make changes to Nonlinear models [[Python:Fitting#Nonlinear]]
 
* '''You can now do Chapra 15.11, 15.22, and 15.29 completely.'''
 
-->
 
  
 
== Submitting Work ==
There are Connect and Lab Assignment parts for almost every problem.  The Connect parts and lab uploads are due the same day, but you will want to get the work done far earlier than that to have time to put together your own lab report.
* You can work in small groups to create the programs.  Once the '''''programs''''' are done, you need to work '''''individually''''' on making the LaTeX document.
* Be sure to carefully read each problem - sometimes Connect will change a number or a prompt slightly from the book problem.  '''Your PDF version should use the original values in the book problem''' so be sure to change them if needed after making calculations for Connect.
* Once you get the Connect assignment 100% correct, you will be able to look at the assignments and the explanations for the answers.  '''Note:''' if there is coding involved in an answer, the solution on Connect will be presented as MATLAB code; take a look to see the similarities and differences with Python.
** Use <code>fig.set_size_inches(6, 4, forward=True)</code> to make your graphs all the same size (see the sketch after this list).
** Be sure to use tight layout and save the graph as a .png (graphics) file, not a .eps file.
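Putting those graph settings together, a minimal sketch of the end of a plotting script might look like this; the figure contents and file name are just placeholders.

 import matplotlib.pyplot as plt
 
 fig, ax = plt.subplots(num=1, clear=True)
 # ... plotting, labeling, and legend commands go here ...
 fig.set_size_inches(6, 4, forward=True)   # make all graphs the same size
 fig.tight_layout()                        # tight layout before saving
 fig.savefig('my_plot.png')                # save as .png, not .eps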
  
<!--
 
== Additional References for args and kwargs ==
 
The functions used to solve nonlinear regression require that certain arguments be passed in a very specific way.  You will need to understand how *args works for this part.  The information below covers both *args and **kwargs.  Basically, *args and **kwargs will capture additional parameters in a function call.  The *args are unnamed and captured in a tuple while the **kwargs are named and captured in a dictionary.  Note that once there is a named parameter in a function call, every parameter after that must also be named!  Here are some references:
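As a small self-contained sketch of that behavior (the function name and arguments are made up for the example):

 def show_args(required, *args, **kwargs):
     # required is an ordinary positional parameter
     # args captures any extra unnamed arguments as a tuple
     # kwargs captures any extra named arguments as a dictionary
     print(required, args, kwargs)
 
 show_args(1, 2, 3, a=4, b=5)
 # prints: 1 (2, 3) {'a': 4, 'b': 5}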
 
* [https://realpython.com/python-kwargs-and-args/ Python args and kwargs: Demystified] at Real Python
 
* The *args and **kwargs tutorial:
 
<html>
 
<iframe src="https://trinket.io/embed/python3/2ff0898517" width="100%" height="600" frameborder="0" marginwidth="0" marginheight="0" allowfullscreen></iframe>
 
</html>
 
-->
 
  
 
== Typographical Errors ==
None yet!

== Specific Problems ==
* Be sure to put the appropriate version of the honor code -- if you use the examples from Pundit, the original author is either DukeEgr93 or Michael R. Gustafson II depending on how you want to cite things.

===Chapra 14.5===

===Chapra 15.10===

===Chapra 15.10 Alternate===

===Chapra 15.12===
* See [[Python:Fitting#General_Linear_Regression]]
=== Quick note on nonlinear regression ===
You will be using nonlinear regression for the next two problems.  The main Python method you will be using is:
* [https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.curve_fit.html scipy.optimize.curve_fit]

Note that generally we will bring in <code>scipy.optimize</code> with
 import scipy.optimize as opt
so the function calls will look like
 opt.curve_fit()

In the documentation on Scipy, they bring in the entire optimize package with
 from scipy import optimize
so their function calls look like
 optimize.curve_fit()
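Here is a hedged sketch of how <code>opt.curve_fit()</code> gets used, assuming a made-up model function and made-up data (not from any of the assigned problems); note that the initial guesses go in through the <code>p0</code> keyword argument.

 import numpy as np
 import scipy.optimize as opt
 
 def fun(x, a, b):
     # made-up model: y = a * (1 - exp(-b * x))
     return a * (1 - np.exp(-b * x))
 
 x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # made-up data
 y = np.array([1.1, 2.0, 3.3, 4.4, 4.9])
 
 popt, pcov = opt.curve_fit(fun, x, y, p0=[5, 1])   # p0 holds the initial guesses for a and b
 yhat = fun(x, *popt)                               # model estimates at the data points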
  
 
===Chapra 15.11===
* See [[Python:Fitting#Nonlinear_Regression]]
* For the initial guesses, make sure you understand the subscripts for the parameters and then figure out how to approximate their values from the information provided in the problem.

===Chapra 15.22===

== General Concepts ==
* [[General_Linear_Regression]]
