Here are some codes. To run a Fortran code such as logistic_1.f, at the command line type

f77 logistic_1.f

If there are coding errors, fix them. If not, at the command line type

./a.out

Good luck.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

c Program logistic_1.f by Gilmore January 19, 2007
c This program computes 1000 successive iterates
c of the logistic map.
      implicit none
      integer i
      real lambda,x,xpr
c begin
      lambda = 4.0
      x = 0.6
      do i=1,1000
         xpr = lambda*x*(1-x)
         x = xpr
      end do
      stop
      end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

c Program logistic_2.f by Gilmore January 19, 2007
c This program computes 1000 successive iterates
c of the logistic map. It also writes the output to
c the computer screen.
      implicit none
      integer i,n
      real lambda,x,xpr
      parameter(n=1000,lambda = 4.0)
c begin
      x = 0.6
      do i=1,n
         xpr = lambda*x*(1-x)
         x = xpr
         write(*,'(i6,f12.6)')i,x
      end do
      stop
      end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

c Program logistic_3.f by Gilmore January 19, 2007
c This program computes 1000 successive iterates
c of the logistic map. It writes the output to
c the computer screen and also opens a link to an
c external file that can be sent into a plotting routine.
      implicit none
      integer i,n
      real lambda,x,xpr
      parameter(n=1000,lambda = 4.0)
c begin
      x = 0.6
      open(11,file='logistic.dat',status='unknown')
      do i=1,n
         xpr = lambda*x*(1-x)
         x = xpr
         write(*,'(i6,f12.6)')i,x
         write(11,'(f12.6)')x
      end do
      close(11)
      stop
      end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

c Program histogram.f by Gilmore January 23, 2007
c This program computes 10000 successive iterates
c of the logistic map. It bins the results into
c 100 equally spaced bins and outputs the histogram
c to a file.
      implicit none
      integer i,k,n,hist(0:100)
      real lambda,x,xpr
      parameter(n=10000,lambda = 4.0)
c begin
      do k=0,100
         hist(k) = 0
      end do
      x = 0.6
      do i=1,n
         xpr = lambda*x*(1-x)
         x = xpr
         k = int(100*x)
         hist(k) = hist(k)+1
      end do
      open(11,file='histogram.dat',status='unknown')
      do k=0,99
         write(11,'(f12.6,i6)')0.005+0.01*k,hist(k)
      end do
      close(11)
      stop
      end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

Maple code to compute the fractal dimension of the Feigenbaum attractor.

> alpha:=2.502907875;
> f1:=(1/alpha)^x+(1/alpha^2)^x=1;
> fsolve(f1,x);

                           0.524508304
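%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

If Maple is not available, the same number can be checked with a few lines of Fortran. The sketch below is not one of the codes above: the name dimension.f and the use of bisection are arbitrary choices. It solves (1/alpha)**d + (1/alpha**2)**d = 1 for the exponent d and should print a value close to 0.524508.

c Program dimension.f
c Estimates the fractal dimension of the Feigenbaum attractor
c by solving (1/alpha)**d + (1/alpha**2)**d = 1 with bisection.
      implicit none
      integer i
      real*8 alpha,dlo,dhi,dmid,f
c begin
      alpha = 2.502907875d0
      dlo = 0.0d0
      dhi = 1.0d0
      do i=1,60
         dmid = 0.5d0*(dlo+dhi)
         f = (1.0d0/alpha)**dmid + (1.0d0/alpha**2)**dmid - 1.0d0
c        f decreases with d, so f > 0 means the root lies above dmid
         if(f.gt.0.0d0)then
            dlo = dmid
         else
            dhi = dmid
         endif
      end do
      write(*,'(a,f12.9)') ' dimension = ',dmid
      stop
      end

Compile and run it the same way as the other Fortran codes (f77 dimension.f, then ./a.out).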
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

This is a C code by Ryan McKeown. It generates data from a Henon map and carries out tests for determinism on the data set. I haven't run it (yet) to guarantee that it does what I want it to do.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <time.h>

#define EPS1 .001
#define EPS2 1.0e-8

double a = 1.4;
double b = .3;

/* uniform random number on [0,1] */
double get_rand(){ return (rand()/(double)RAND_MAX); }

/* Henon map: x' = a - x*x + z, z' = b*x */
double map1(double x, double z){ return a - x*x + z; }
double map2(double x){ return b*x; }

//KS-test
double qks(double x){
    int j;
    double a2,fac=2.0,sum=0.0,term,termbf=0.0;
    a2 = -2.0*x*x;
    for(j=1;j<100;j++){
        term = fac*exp(a2*j*j);
        sum += term;
        if(fabs(term) <= EPS1*termbf || fabs(term) <= EPS2*sum) return sum;
        fac = -fac;
        termbf = fabs(term);
    }
    return 1.0;
}

int main(int argc, char *argv[])
{
    srand(0);
    const int RANGE   = 6;
    const int NBINS   = 101;
    const int NPOINTS = 5000;
    const double XMIN = -RANGE/2.;

    double x[NPOINTS];                /* uniform random reference data */
    double y[NPOINTS];                /* Henon map data                */
    int output[NBINS];

    /* random reference data over the same range as the histogram;
       substituting x for y below gives the non-deterministic case  */
    for(int i=0;i<NPOINTS;i++) x[i] = get_rand()*RANGE + XMIN;

    /* Henon data: iterate 2*NPOINTS times and keep the second half,
       so the first NPOINTS steps serve as a transient               */
    double temp = 0., z2 = 0.;        /* (0,0) is in the basin of attraction */
    for(int i=0;i<2*NPOINTS;i++){
        if(i < NPOINTS){
            double xnew = map1(temp,z2);
            z2 = map2(temp);
            temp = xnew;
        }
        if(i >= NPOINTS){
            y[i-NPOINTS] = map1(temp,z2);
            z2 = map2(temp);
            temp = y[i-NPOINTS];
        }
    }

    /* one-step prediction errors: for each point find the closest
       earlier pair of successive values (a two-dimensional embedding)
       and use the successor of that pair as the prediction            */
    double start = XMIN;
    double dx = RANGE/(double)NBINS;
    for(int i=0;i<NBINS;i++) output[i] = 0;
    for (int j=3;j<NPOINTS;j++){
        int k = 1;
        double dmin = 1.0e30, d;
        for(int m=1;m<j-1;m++){
            d = fabs(y[m]-y[j-1]) + fabs(y[m-1]-y[j-2]);
            if(d < dmin){ dmin = d; k = m; }
        }
        temp = y[j] - y[k+1];          /* prediction error */
        for(int step=0;step<NBINS;step++)
            if( temp >= (start+(dx*step)) && temp < (start+(dx*(step+1))) )
                output[step]++;
    }

    /* KS statistic: cumulative error distribution against the
       distribution of a delta function at zero (a step function) */
    double D = 0., td, func;
    double en = sqrt((double)NPOINTS);
    for(int i=0;i<NBINS;i++){
        if(i > 0) output[i] += output[i-1];    /* cumulative count */
        func = 0.;
        if( start+(dx*(i+1)) > 0) func = NPOINTS;
        td = fabs((output[i]-func)/(float)NPOINTS);
        if(td > D) D = td;
    }
    double sig = qks((en+.12+.11/en)*D);
    fprintf(stderr,"significance value: %f\n",sig);
    return 0;
}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

Another C code for estimating determinism can be found at Max Polun's website: http://www.pages.drexel.edu/~mlp42/codes/

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

c PROGRAM test_det_log.f by Gilmore January 25, 2007
c
c This is a test for determinism in a time series.
c The time series (here generated by a logistic map) is
c partitioned into two parts. The first (1 -> half)
c is the learning set or data base. The second
c (half+1 -> 2*half) is the test data set. A data point in
c the test set is chosen and the point in the learning
c set closest in value is located. The next point in the
c learning set is used as an estimator of the next value
c in the test set. The difference between the two 'next'
c values is computed. This difference is binned in a
c histogram. If the system is deterministic we should
c find a histogram that is essentially a Dirac delta
c function, with a little bit of slop. This slop is not
c due to numerical imprecision or roundoff error. It is
c due to nonuniform sensitivity to initial conditions
c across the data set.
c
      IMPLICIT none
      integer i,j,k,p,n,half,hist(-100:100)
      integer kk,kkk
      parameter(n=10001,p=2,half=5000)
      real*8 x,xpr,eps,lam,test
      real*8 data(n),dist,oldnxt,next,diff
c begin
      write(*,*)
      lam = 4.0

      x = 0.4                      !!!!! construct total data set
      do i=1,n
         xpr = lam*x*(1-x)
         data(i) = xpr
         x = xpr
      end do

      do kkk=-100,100              !!! zero out histogram
         hist(kkk) = 0
      end do

      do i=half+1,2*half           !!! scan over test set
         kk = 0
         dist = 1.0
         do j=1,half               !!! scan over learning set
            eps = data(j)-data(i)
            if(abs(eps).lt.dist)then
               kk = j
               dist = abs(eps)
c              write(*,'(i6,f12.6)')kk,dist
            endif
         enddo                     !!! end j loop (learning set)
         oldnxt = data(kk+1)
         next = data(i+1)
         diff = next - oldnxt
         kkk = int(200+100*diff)-200
c        write(*,'(2i6,f12.6)')i,kkk,diff
         hist(kkk) = hist(kkk)+1
      end do                       !!!!!! end loop (test set)

      open(11,file='hist.dat',status='unknown')
      do kkk=-100,100
         write(11,'(2i6)')kkk,hist(kkk)
      end do
      close(11)

  900 write(*,*)
      stop
      end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
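A quick way to summarize hist.dat from test_det_log.f is sketched below. This is not one of the codes above: the name check_hist.f and the choice of the five central bins (kkk = -2 to 2) are arbitrary. It reads the 201 histogram bins and prints the fraction of all counts that land in the central bins; if the series really is deterministic, that fraction should be close to 1, for the reason given in the header comments of test_det_log.f.

c Program check_hist.f
c Reads the histogram hist.dat written by test_det_log.f and
c prints the fraction of all counts in the five central bins.
      implicit none
      integer kkk,k,h,total,central
      real frac
c begin
      total = 0
      central = 0
      open(11,file='hist.dat',status='old')
      do kkk=-100,100
         read(11,'(2i6)')k,h
         total = total + h
         if(abs(k).le.2) central = central + h
      end do
      close(11)
      frac = real(central)/real(total)
      write(*,'(a,f8.4)') ' fraction in central bins: ',frac
      stop
      end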