The Statistics of Gene Mapping (Statistics for Biology and Health) Review

David Siegmund is a famous probabilist who is both a great lecturer and writer. I personally audited his advanced probability course at Stanford. He coauthored a book on optimal stopping with Herb Robbins and has written other fine books on sequential analysis and repeated significance testing. In recent years he, along with Brad Efron and other Stanford and Berkeley statistics professors, has studied the mathematics, probability theory, and statistics associated with human genetics and microarray data. This book presents the theory and application of the appropriate probabilistic methods. Anyone with a serious interest in this topic should get the book.
The book assumes some knowledge of probability and statistics, so a novice in the field could struggle with the text and need additional preparation. For the statistician, it may assume a little too much knowledge of genetics. But I think it is the perfect book for the intended audience and makes a great reference.
Another text that is rigorous in terms of statistics but assumes less knowledge of statistics and genetics is "Analyzing Microarray Gene Expression Data" by G. J. McLachlan, K.-A. Do, and C. Ambroise. You will find that I have also reviewed that text on Amazon.


This book details the statistical concepts used in gene mapping, first in the experimental context of crosses of inbred lines and then in outbred populations, primarily humans. It presents elementary principles of probability and statistics, which are implemented by computational tools based on the R programming language to simulate genetic experiments and evaluate statistical analyses. Each chapter contains exercises, both theoretical and computational, some routine and others that are more challenging. The R programming language is developed in the text.
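Since the blurb highlights simulating genetic experiments in R, here is a minimal sketch of the same idea in Python (my own toy construction, not the book's code): a backcross is simulated with a marker linked to a QTL at recombination fraction r, and the basic single-marker statistic, the difference in mean phenotype between the marker classes, is computed.

```python
import random

def simulate_backcross(n, r, effect=2.0, seed=0):
    """Simulate n backcross progeny. Each individual carries QTL genotype
    aa (0) or Aa (1); a linked marker recombines with probability r.
    Phenotype = additive QTL effect + standard normal noise."""
    rng = random.Random(seed)
    markers, phenotypes = [], []
    for _ in range(n):
        qtl = rng.randint(0, 1)
        marker = qtl if rng.random() > r else 1 - qtl  # recombination event
        markers.append(marker)
        phenotypes.append(effect * qtl + rng.gauss(0, 1))
    return markers, phenotypes

def marker_mean_difference(markers, phenotypes):
    """Difference in mean phenotype between the two marker classes: the
    basic single-marker statistic, with expectation (1 - 2r) * effect."""
    g0 = [p for m, p in zip(markers, phenotypes) if m == 0]
    g1 = [p for m, p in zip(markers, phenotypes) if m == 1]
    return sum(g1) / len(g1) - sum(g0) / len(g0)

markers, phenotypes = simulate_backcross(2000, r=0.1)
diff = marker_mean_difference(markers, phenotypes)  # close to 1.6 here
```

With effect 2 and r = 0.1 the expected difference is 0.8 * 2 = 1.6, and the simulation recovers it to within sampling error.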


Adaptive Filtering: Algorithms and Practical Implementation Review

The book by Prof. Diniz is indeed among the best on adaptive signal processing. Most of the fundamental concepts are well explained, suitable examples are given, and practical applications are also discussed. The chapter on adaptive IIR filters is unique and still cannot be found in any other book. Moreover, solutions to the problems can be obtained by FTP, which is very useful for students. Although this is an excellent book (5 stars), the price is ridiculous, as occurs with most titles from this publisher. If I were Prof. Diniz I would change from Kluwer Academic Publishers to a more competitive publisher.


This book presents the basic concepts of adaptive signal processing and adaptive filtering in a concise and straightforward manner, using clear notation that facilitates actual implementation. Important algorithms are described in detailed tables which allow the reader to verify learned concepts. The book covers the family of LMS-based algorithms as well as set-membership, subband, blind, and IIR adaptive filtering, and more. It includes a CD supplement for instructors and students, offering lecture transparencies as well as MATLAB code for all algorithms described in the text. The book is also supported by a web page maintained by the author.
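As a taste of what the LMS family does, here is a hedged NumPy sketch of LMS system identification (my own toy example with a made-up 3-tap system; the book's MATLAB code on the CD is the authoritative reference):

```python
import numpy as np

def lms_identify(x, d, num_taps, mu):
    """Identify an unknown FIR system with the LMS adaptive filter.
    x: input signal, d: desired signal, mu: step size.
    Returns the adapted weights and the error history."""
    w = np.zeros(num_taps)
    err = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # tap-delay line, newest first
        e = d[n] - w @ u                     # a priori estimation error
        w = w + mu * e * u                   # LMS weight update
        err[n] = e
    return w, err

rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.1])               # hypothetical unknown system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, err = lms_identify(x, d, num_taps=3, mu=0.01)
```

After a few thousand samples the weight vector `w` sits very close to the unknown `h`, and the error power has dropped to roughly the measurement-noise floor.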


Handbook of Parallel Computing and Statistics (Statistics: A Series of Textbooks and Monographs) Review

It came as somewhat of a surprise to the industry that coupling together several PCs enabled the construction of what was in effect a supercomputer at a small fraction of the cost.
What began twenty or so years ago has now influenced the design of CPUs and the interconnection 'LANs' that facilitate the transfer of data between the processors. And this clearly hasn't stopped. The AMD Opteron CPUs and Intel's PCI-Express are simply the latest innovations in silicon, and more is coming.
From a system architecture standpoint, we have (and the book discusses) clusters, grids, and distributed processor systems, all of which are fairly loosely defined, with plenty of room for very good discussions over several beers.
What this book brings is an excellent introduction to the state of the art in parallel computers as it exists today. As is often the case with books that push the state of the art, it is written by a large number of experts and edited together. Each chapter covers a particular area in depth, from the design of the hardware to the languages (primarily Fortran and Java) to the solution of a series of common problems that are frequent in several different application areas.
This book is an excellent summary of parallel computing as it exists today. It would be of particular help to the person responsible for writing the proposal for an organization to buy or build one. The book is probably a bit too advanced for a course at the undergraduate level, but would be excellent for first-year graduate students in a wide variety of fields, from computer science to bioinformatics, data mining, cryptography, or any number of other fields requiring heavy-duty computation.


Technological improvements continue to push back the frontier of processor speed in modern computers. Unfortunately, the computational intensity demanded by modern research problems grows even faster. Parallel computing has emerged as the most successful bridge to this computational gap, and many popular solutions have emerged based on its concepts, such as grid computing and massively parallel supercomputers. The Handbook of Parallel Computing and Statistics systematically applies the principles of parallel computing for solving increasingly complex problems in statistics research.
This unique reference weaves together the principles and theoretical models of parallel computing with the design, analysis, and application of algorithms for solving statistical problems. After a brief introduction to parallel computing, the book explores the architecture, programming, and computational aspects of parallel processing. Focus then turns to optimization methods followed by statistical applications. These applications include algorithms for predictive modeling, adaptive design, real-time estimation of higher-order moments and cumulants, data mining, econometrics, and Bayesian computation. Expert contributors summarize recent results and explore new directions in these areas.
Its intricate combination of theory and practical applications makes the Handbook of Parallel Computing and Statistics an ideal companion for helping solve the abundance of computation-intensive statistical problems arising in a variety of fields.
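To make the parallel-statistics idea concrete, here is a small Python sketch of a parallel bootstrap, an embarrassingly parallel computation (my own illustration; the threads stand in for the processes or cluster nodes a real deployment would use):

```python
from concurrent.futures import ThreadPoolExecutor
import random
import statistics

def bootstrap_mean(args):
    """One independent work unit: resample the data with replacement
    and return the resample's mean."""
    data, seed = args
    rng = random.Random(seed)
    return statistics.fmean(rng.choices(data, k=len(data)))

rng = random.Random(42)
data = [rng.gauss(10, 2) for _ in range(500)]

# Farm 200 replicates out to a worker pool; each replicate is
# independent, so they can run on any number of workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    reps = list(pool.map(bootstrap_mean, [(data, s) for s in range(200)]))

boot_se = statistics.stdev(reps)  # bootstrap standard error of the mean
```

The estimated standard error lands near the theoretical 2/sqrt(500), and nothing about the work units changes if the pool is replaced by processes or grid nodes.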


Handbook of Mathematical Models in Computer Vision Review

When attending a general computer vision conference like xCCV, did you ever feel lost at certain sessions? Well, don't always blame the presenters! The field covered by Computer Vision has become so broad that it is almost impossible to understand what is going on and to keep track of the latest developments. To (partially) overcome this problem, the editors of the Handbook of Mathematical Models in Computer Vision have done a great job.
One can be a bit skeptical reading such a title. How complete can such a handbook be? However, going through the 33 chapters, a wide breadth of material is indeed treated. The focus of the book is on mathematical methods that both model and reproduce human visual abilities. This is the field of biological vision, in which the editors have a strong background.
The editors chose three distinct categories of mathematical models, namely variational techniques (those attending Prof. Faugeras' talk at ICPR 2006 may remember his statement that they give the fundamental equations in computer vision!), statistical methods, and combinatorial approaches. The chapters are grouped in six sections that circle around these three categories. Although going through the book's chapters by mentioning keywords may yield a rather boring list, it shows the wide variety of topics that are dealt with.
The book starts with a section on low-level vision: Image Reconstruction. Here one can find information on diffusion filters and wavelets, total variation methods, and PDE based inpainting.
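As a toy illustration of the total variation methods mentioned in this section, the following sketch (my own, using a smoothed 1D Rudin-Osher-Fatemi energy with parameters chosen purely for demonstration) denoises a noisy step edge by plain gradient descent:

```python
import numpy as np

def tv_denoise_1d(f, lam=0.5, eps=1e-2, tau=0.02, iters=3000):
    """Gradient descent on a smoothed ROF energy:
    0.5*||u - f||^2 + lam * sum_i sqrt((u[i+1] - u[i])^2 + eps)."""
    u = f.copy()
    for _ in range(iters):
        du = np.diff(u)
        g = du / np.sqrt(du ** 2 + eps)  # derivative of the smoothed |du|
        # discrete divergence of g (negative adjoint of forward difference)
        div = np.concatenate(([g[0]], np.diff(g), [-g[-1]]))
        u = u - tau * ((u - f) - lam * div)
    return u

rng = np.random.default_rng(1)
clean = np.concatenate([np.zeros(50), np.ones(50)])  # a step edge
noisy = clean + 0.2 * rng.standard_normal(100)
denoised = tv_denoise_1d(noisy)
```

The point of the TV penalty is visible in the result: noise in the flat regions is strongly suppressed while the step edge survives, which a quadratic smoothness penalty would blur away.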
The second section is concerned with Boundary Extraction, Segmentation and Grouping. Here subjects like levelings, graph cuts, minimal paths and fast marching methods, deformable models, variational segmentation with shape priors, curve propagation, level set methods, and a stochastic model of geometric snakes are discussed.
Section three switches to high level vision. It deals with Shape Modeling & Registration, divided into topics concerning invariant processing and occlusion resistant recognition, image-based inferences, point matching and uncertainty-driven, point-based image registration.
In the fourth section, Motion Analysis, Optical Flow & Tracking, the concept of time is added and one encounters the topics of optical flow estimation, image warping, alignment and stitching, visual tracking, image and video segmentation, human motion capture, and dynamic textures.
Section five deals with 3D from Images, Projective Geometry & Stereo Reconstruction, treated by boundary detection, stereo, texture and color, shape from shading, calibration, motion and shape recovery, multi-view reconstruction, binocular stereo with occlusions, and modeling non-rigid dynamic scenes.
The last section may seem a bit odd: Applications: Medical Image Analysis. However, this is one of the most prominent areas in computer vision. Although certain aspects of natural images (just think of the influence of the sun) do not occur here, for many tasks the performance of the mathematical methods can be evaluated, since a ground truth is often available, provided by the humans whom the models are supposed to mimic. In this section, applications of interactive graph-based segmentation methods, 3D active shape and appearance models, characterization of diffusion anisotropy, segmentation, variational approaches, and statistical methods of registration are given.
The danger of publishing an edited volume is the difference in style and treatment of the topics among the various contributions. This is not the case here. Each chapter gives a general introduction to the topic, introduces the mathematical model, discusses the underlying ideas globally, and shows some results. For the full details the readers are referred to the extensive bibliography with 929 entries.
This book is a must-have for those interested in the full breadth of research done in the biological and computer vision community. As a bonus, the chapters can also be used in a seminar-based, advanced undergraduate course in mathematically based computer vision.



Model-Driven Design Using Business Patterns Review

I was accidentally pulled into the world of REA. I was evaluating the redesign of our major financial application. After thinking deeply about the details of the business application's dynamics, I started to group them under some basic, oversimplified models and entities.
From there, I started to think that there should be somebody out there who faced the same situation and solved the same set of problems with a similar approach and hopefully more elegantly.
Then I stopped evolving my model and started searching the literature and the Internet. I came across Fowler's book, which I thought was great and liked very much, especially the modelling of the account and the related entries. But that was about it as far as the simplicity goes; it started to get more complex as I got into more patterns.
I did some more searching until I got to REA (Resources-Events-Agents), and that was it. I was blown away.
The model is so simple but powerful in capturing the most fundamental concepts in the accounting and business domain.
Unfortunately, I did not find enough resources (at the time) that examine REA and its applications in detail, until I found this wonderful book.
I really thank the author for his work.
I think the REA model will change the business information modelling arena the same way object-oriented programming changed the programming world and design patterns impacted the design world.
I also predict that this book will be to the business application architecture community what the GoF book is to the software design community at large.



This book shows how to apply pattern ideas in business applications. It presents more than 20 structural and behavioral business patterns that use the REA (resources, events, agents) pattern as a common backbone. The developer working on business frameworks can use the patterns to derive the right abstractions and to design and ensure that the meta-rules are followed by the developers of the actual applications. The application developer can use these patterns to design a business application, to ensure that it does not violate the domain rules, and to adapt the application to changing requirements without the need to change the overall architecture.
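A minimal sketch of the REA backbone, with hypothetical class and field names of my own choosing, might look like this in Python: every economic event moves a resource between two agents, and an exchange pairs a decrement event with its compensating increment (the "duality" at the heart of the pattern).

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str

@dataclass
class Resource:
    name: str
    quantity: float

@dataclass
class EconomicEvent:
    resource: Resource
    quantity: float       # positive = increment, negative = decrement
    provider: Agent
    receiver: Agent

@dataclass
class Exchange:
    """Duality link: a decrement event compensated by an increment."""
    decrement: EconomicEvent
    increment: EconomicEvent

    def is_balanced_in_kind(self) -> bool:
        # one side must give something up, the other must receive
        return self.decrement.quantity < 0 < self.increment.quantity

seller, buyer = Agent("seller"), Agent("buyer")
goods, cash = Resource("widget", 10), Resource("cash", 0)
sale = Exchange(
    decrement=EconomicEvent(goods, -2, seller, buyer),
    increment=EconomicEvent(cash, +50, buyer, seller),
)
```

Domain rules such as "every decrement must be matched by an increment" then become checks on the `Exchange` object rather than conventions scattered through the application.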


Ecological Models and Data in R Review

This book, in part, was developed from Dr. Bolker's graduate course in Ecological Models and Data at the University of Florida. This was the best course I took as a graduate student; it transformed the set of quantitative tools I was able to bring to bear on ecological questions. There was so much worthwhile material covered in this class that I took it twice (UF only counted the first time). Since graduate school, I have frequently referred to my notes from the class. With the publication of "Ecological Models and Data in R," even those who didn't have the good fortune of being in Bolker's class can learn approaches for integrating ecological theory and data. Bolker's book covers much of the material from his course and thus is an excellent resource for graduate students and faculty alike.



Ecological Models and Data in R is the first truly practical introduction to modern statistical methods for ecology. In step-by-step detail, the book teaches ecology graduate students and researchers everything they need to know in order to use maximum likelihood, information-theoretic, and Bayesian techniques to analyze their own data using the programming language R. Drawing on extensive experience teaching these techniques to graduate students in ecology, Benjamin Bolker shows how to choose among and construct statistical models for data, estimate their parameters and confidence limits, and interpret the results. The book also covers statistical frameworks, the philosophy of statistical modeling, and critical mathematical functions and probability distributions. It requires no programming background--only basic calculus and statistics.

Practical, beginner-friendly introduction to modern statistical techniques for ecology using the programming language R
Step-by-step instructions for fitting models to messy, real-world data
Balanced view of different statistical approaches
Wide coverage of techniques--from simple (distribution fitting) to complex (state-space modeling)
Techniques for data manipulation and graphical display
Companion Web site with data and R code for all examples
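The maximum likelihood workflow described above can be sketched in a few lines; this toy Python version (my own, since the book itself uses R) fits a Poisson rate by grid-minimizing the negative log-likelihood, whose minimizer is analytically the sample mean:

```python
import math
import random

def neg_log_lik(lmbda, counts):
    """Poisson negative log-likelihood, dropping the constant log(k!) term."""
    return len(counts) * lmbda - sum(counts) * math.log(lmbda)

def poisson(rng, lam):
    """Crude Poisson sampler (Knuth's multiplication method)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(7)
counts = [poisson(rng, 3.0) for _ in range(1000)]

# grid search for the MLE over lambda in [1.00, 5.99]
grid = [i / 100 for i in range(100, 600)]
mle = min(grid, key=lambda l: neg_log_lik(l, counts))
```

The grid minimizer agrees with the sample mean up to the grid resolution, which is exactly the kind of sanity check the book encourages before trusting a numerical fit on a harder model.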



Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing Review

The book starts with a prologue on an under-determined linear system and how sparsity constraints help to solve it with the use of a Lagrangian. Next the authors introduce the key idea of how certain norms promote sparsity. There are some good diagrams that really help the geometric intuition (though not as good as the ones by Donoho et al. in connection with the Lasso). I really love the way they motivate and frame the entire field while still appealing to concepts that most people who have studied linear algebra can relate to.
The first six chapters are a masterpiece of pedagogy, except for the not-so-standard use of the spark as the measure of coherence among elements of a dictionary. Mutual coherence is more common and easier to grasp, since it directly addresses the size of the inner products. This leads to a rather jarring switch when the RIP is introduced.
I am still puzzled why the authors do not appeal to frame theory. That leads to strange-looking references to self-dual frames and tight frames when the book never discusses frames.
I also wonder why the authors did not cite Boyd's great book. The log-barrier was treated as just another penalty function, and the term 'log-barrier' is never used in the book.
Overall, I could not put the book down, and I was especially grateful to the authors for making iterative shrinkage a central theme linking many modern numerical algorithms for the basic sparse optimization problem.
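Iterative shrinkage reduces to a one-line soft-thresholding step; here is a hedged ISTA sketch in NumPy (my own toy problem, not code from the book) for min 0.5||Ax - b||^2 + lam||x||_1 on an under-determined system:

```python
import numpy as np

def soft_threshold(x, t):
    """Shrinkage operator: the proximal map of t * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, iters=2000):
    """Iterative shrinkage-thresholding: gradient step on the quadratic
    term followed by soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)  # 40 equations, 100 unknowns
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]            # 3-sparse signal
b = A @ x_true
x_hat = ista(A, b, lam=0.01)
```

Despite the system being under-determined, the l1 penalty recovers the sparse solution to good accuracy, which is the core phenomenon the book is about.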


This textbook introduces sparse and redundant representations with a focus on applications in signal and image processing. The theoretical and numerical foundations are tackled before the applications are discussed. Mathematical modeling for signal sources is discussed along with how to use the proper model for tasks such as denoising, restoration, separation, interpolation and extrapolation, compression, sampling, analysis and synthesis, detection, recognition, and more. The presentation is elegant and engaging. Sparse and Redundant Representations is intended for graduate students in applied mathematics and electrical engineering, as well as applied mathematicians, engineers, and researchers who are active in the fields of signal and image processing.


The Algorithmic Beauty of Seaweeds, Sponges and Corals Review

I really like things in the ocean. They are thinking about drilling in the Arctic Ocean. Then they are fighting about who should own the Arctic Circle. It's really stupid. I wish it would stop.


This book gives a state-of-the-art overview of modeling the growth and form of marine sessile organisms (such as stromatolites, algae, and metazoans including stony corals, hydrocorals, octocorals, and sponges), using large-scale computing techniques, scientific visualization, methods for analyzing 2D and 3D forms, and particle-based modeling techniques. It originates from the workshop on Modeling Growth and Form of Marine Sessile Organisms, held at the National Center for Ecological Analysis and Synthesis, Santa Barbara, California, in August 1999. Experts from various disciplines, including developmental biology, ecology, computer science, physics, and mathematics, who have research interests in modeling the development of these organisms were invited to contribute. The book describes all the steps required to develop and experimentally validate morphological models, including collecting biological information and methods for specifying and comparing forms. Examples are given of how models are currently being applied to simulate the growth and form of marine sessile organisms. Potential applications of growth models and morphological analyses in modern and paleo-bio-monitoring, the detection of environmental change, and the conservation and restoration of marine ecosystems and aquaculture are addressed. The combination of simulation models with laboratory and field experiments provides a powerful tool for obtaining insights into how the growth forms of marine organisms emerge from physical, genetic, and environmental influences.


Longitudinal Data Analysis (Chapman & Hall/CRC Handbooks of Modern Statistical Methods) Review

First of all, this isn't really a textbook. It's more of an encyclopedia-textbook hybrid. You won't find proofs of theorems or (many) fully worked examples; however, you'll get thorough descriptions of methods, their current status, and their full history. Each section is written by one to four of the 32 contributors to the text, with each contributor writing on his or her area of expertise. Contributors include Fitzmaurice, Verbeke, Molenberghs, Brumback, Carroll, Diggle, Little, Muller, and Robins, among others.
If a section is not thorough enough for your liking, the authors refer you to a plethora of relevant papers for just about everything that is stated in the text. This makes it incredibly easy to investigate further into a given topic. Furthermore, the references are fairly current, with some sections including references up to the year 2007.
As I alluded to before, each chapter not only covers different methodologies for analyzing longitudinal data but also includes a history of research regarding each method: what we used before those methods were introduced, why they were lacking, how the current methods address some of those problems, and what directions exist for further research. This gives the reader a picture of how the field has evolved and adapted to the problems that longitudinal data analyses present researchers. Some may find this uninteresting, but seeing the historical research process for some of these methods, in my opinion, illustrates how the field has reached the point it is at today and, more generally, how research progresses in the real world.
The nonparametric/semiparametric regression methods and missing data sections are especially appealing, for I am unaware of any other source that compiles all current research in these areas in one volume.
As I say in the title of my review, this is akin to a 'travel guide.' For example, you won't find a derivation of GEEs in this text, but you'll find what they are, why they were introduced, what they replaced, why we bother using them, what properties they have, what extensions researchers have come up with, and what we still don't know; and, most importantly, you'll be directed to where you can find the derivations of all of these. Thus it won't be your final stop in researching any methodology for longitudinal data, but it will most likely turn into your first.
This text is a great example of what can be done with this series of texts (Handbooks of Modern Statistical Methods). I look forward to future installments.


Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data.
After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint models, and incomplete data. Each of these sections begins with an introductory chapter that provides useful background material and a broad outline to set the stage for subsequent chapters. Rather than focus on a narrowly defined topic, chapters integrate important research discussions from the statistical literature. They seamlessly blend theory with applications and include examples and case studies from various disciplines.
Destined to become a landmark publication in the field, this carefully edited collection emphasizes statistical models and methods likely to endure in the future. Whether involved in the development of statistical methodology or the analysis of longitudinal data, readers will gain new perspectives on the field.


Cryptanalytic Attacks on RSA Review

OK, so I admit I don't understand most of this book. The editorial review says that it is geared at professionals at the graduate level. My education is at the undergraduate level, but I was able to follow along and learn a lot. Being able to derive and discover such theories and cryptography schemes is another story. These techniques are relatively simple with respect to seeing how the algorithms run, but the number theory behind them is profound: seeing an algorithm run is one thing; manipulating and attacking it is another.
But like I said, with the proper math experience, even without knowing the mathematics behind RSA you will learn a lot. That is why I give this book 5 stars. The reason I read this book is that in comics and movies, spies are always using secret messages. I wanted to know exactly what the attacks on RSA are. And even though the material is difficult, seeing the attacks gives me a mental picture of what is possible. In other words, the things I don't know, I want to know without getting a Ph.D. in mathematics. So even if you don't have all the higher mathematical experience, you can learn from this book if you read through it.
I recommend knowledge of linear algebra and of working with series. This book is to the point. It is not in a traditional textbook format, so it takes a little more work to go through the theorems and examples. The list of references in the back of the book is extensive and is referred to often, because although the book is filled with details, some subjects need to be explored more deeply if you plan on researching a particular topic further.
Here is a listing of the attacks covered:
--Direct attacks: integer factorization attacks, discrete logarithm attacks, quantum factoring and discrete logarithm attacks
--Indirect attacks: common modulus attack, fixed-point attacks, guessing-d attacks, knowing Euler's totient function, forward attack, e-th root attack, short-e attacks, short-d attacks, partial key exposure attacks
--Implementation (side-channel) attacks on d, p, q: timing attacks, power attacks, electromagnetic radiation attacks, random fault (glitch) attacks
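Of these, the common modulus attack is simple enough to demonstrate end to end. The following Python sketch (with toy parameters of my own choosing, far too small for real use) recovers a message encrypted twice under the same modulus with coprime exponents:

```python
def common_modulus_attack(n, e1, e2, c1, c2):
    """Recover m from c1 = m^e1 mod n and c2 = m^e2 mod n
    when gcd(e1, e2) = 1 and gcd(m, n) = 1."""
    def egcd(x, y):
        # extended gcd: returns (g, u, v) with u*x + v*y = g
        if y == 0:
            return x, 1, 0
        g, u, v = egcd(y, x % y)
        return g, v, u - (x // y) * v

    g, a, b = egcd(e1, e2)
    assert g == 1, "exponents must be coprime"
    # m = c1^a * c2^b mod n; negative exponents need a modular inverse
    t1 = pow(c1, a, n) if a >= 0 else pow(pow(c1, -1, n), -a, n)
    t2 = pow(c2, b, n) if b >= 0 else pow(pow(c2, -1, n), -b, n)
    return (t1 * t2) % n

# toy parameters; never use moduli this small in practice
p, q = 1009, 2003
n = p * q
m = 424242
e1, e2 = 3, 5
c1, c2 = pow(m, e1, n), pow(m, e2, n)
recovered = common_modulus_attack(n, e1, e2, c1, c2)
```

Here 2*e1 - e2 = 1, so the attack computes c1^2 * c2^(-1) = m^6 * m^(-5) = m mod n without ever factoring n, which is exactly why a modulus must never be shared between users.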
This list was taken from page 230. I will also list some areas of interest I marked in the book:
--For some odd reason, pages 5 and 6 are reversed in the book, but there are no other print mistakes.
--Page 58 has Edouard Lucas's 1891 cylindrical cryptography problem that is unsolved.
--The Russian mathematician Bouniakowsky discovered a clever algorithm in 1870 for solving a^x ≡ b (mod n), along with its asymptotic complexity. Not much is mentioned about it; it is only a paragraph, but it caught my attention while reading.
--On various pages, elliptic curve cryptography is worked out. The book also lists some alternatives to RSA and its variations: elliptic curve, coding-based, lattice-based, and quantum cryptography.
As stated, the material is advanced but explained mathematically in a very concise manner. In fact, the book is mostly mathematical steps, with paragraphs used only to open or summarize each chapter. What makes this book work is that everything written is to the point yet worked out enough to follow along.
I really enjoyed this book. I was, however, expecting more to be said about prime numbers. Yes, p and q are prime, but I was interested in how knowing the primes would crack RSA. This would fall under the integer factorization problem, but I guess factoring is not supposed to be solvable in polynomial time, so N is supposed to be secure. I'll admit that much of my interest in this book was the RSA attacks. There is something mysterious, a childlike curiosity, in cracking a code. Granted, it is no easy task, and reading this book doesn't make you an expert code cracker overnight. But it is a step in that direction, and it introduces the reader to the mystery of how messages can be encoded so only the right person sees them, from a math perspective where there is potential to create even more mysteries in the field.
Oh, and another thing: I believe a logarithmic spiral can find a series in the primes, if one exists. (vms)


Artificial Life Models in Software Review

Remember Conway's Game of Life? Surely you must, if you are interested in this book. The Game has been around since the 70s. The editors have cultivated recent research papers that demonstrate how far the field has advanced, reinforced by some pretty colour plates that depict artificial entities [dare we call them living?] in their surroundings. These include the modelling of bee flights through a garden and simulated trajectories of a group of bacteria.
Nor is the Game of Life ignored. One plate shows it in three dimensions: the Game is played in two dimensions, with time as the third, an obvious choice that gives interesting trajectories of the cells.
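For readers who want to replay the Game themselves, a sparse-set implementation of one generation fits in a dozen lines of Python (my own sketch, not from the book):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life on an unbounded grid.
    `live` is a set of (x, y) cells; returns the next generation."""
    # count how many live neighbours each candidate cell has
    neigh = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # birth on exactly 3 neighbours; survival on 2 or 3
    return {c for c, n in neigh.items() if n == 3 or (n == 2 and c in live)}

# a blinker oscillates with period 2
blinker = {(0, 0), (1, 0), (2, 0)}
after_one = life_step(blinker)
after_two = life_step(after_one)
```

Stacking the successive `live` sets along a time axis reproduces exactly the 2D-plus-time trajectories shown in the book's three-dimensional plate.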
The narrative adds to the illustrations by describing a variety of computer simulations [worlds?] in which the experimenter can tweak many parameters and watch her world unfold. Some worlds are impressively rich in the complexity of their observed behaviours.
The only drawback of the book is its skimpy index: a mere two pages. It should have been more detailed.


New Introduction to Multiple Time Series Analysis Review

New Introduction to Multiple Time Series Analysis
Average Reviews:

(More customer reviews)
If you are looking for a book on VARs and cointegration, this is it.
Very clearly written, and with numerical applications of every new concept (so that you can check the accuracy of your codes ...)
It's a significantly improved version of the last edition.
Highly recommended.
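As a toy example of the kind of numerical cross-check the reviewer mentions (my own numpy sketch, not taken from the book): simulate a bivariate VAR(1) process and recover its coefficient matrix by least squares.

```python
import numpy as np

# Simulate a bivariate VAR(1), y_t = A y_{t-1} + u_t, then recover A by OLS.
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])           # stable: eigenvalues inside unit circle
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + 0.1 * rng.standard_normal(2)

# Least-squares estimate: regress y_t on y_{t-1}.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
assert np.allclose(A_hat, A, atol=0.05)   # estimate close to the truth
```

Checking an estimator against data simulated from known parameters is exactly how one verifies one's own VAR code against the book's worked examples.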


Click Here to see more reviews about: New Introduction to Multiple Time Series Analysis

This reference work and graduate-level textbook deals with analyzing and forecasting multiple time series, considering a wide range of models and methods. It is based on the author's successful Introduction to Multiple Time Series Analysis, updated to include the state of the art and latest developments in the field. The book enables readers to perform their analyses in a competent and up-to-date manner, bridging the gap to the difficult technical literature on the topic.

Buy Now

Click here for more information about New Introduction to Multiple Time Series Analysis

Read More...

Simulation: A Modeler's Approach (Wiley Series in Probability and Statistics) Review

Simulation: A Modeler's Approach (Wiley Series in Probability and Statistics)
Average Reviews:

(More customer reviews)
I found Thompson's book a pleasure to read. I was amazed to find that difficult mathematical problems (such as the Gambler's Ruin problem, or the Fokker-Planck model) can be analyzed much more easily once a microaxiomatic formulation with a simulation flow diagram is derived. Thompson uses a very clear style to explain things. Geometric Brownian processes, resampling-based tests, multivariate procedures, and much more interesting material are treated in this book. People who are interested in the possibilities of problem-solving by simulation (more than in the mathematical foundations of simulation) will find this book useful. I am certainly going to use simulation in my research more than I used to.
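As a flavour of the approach, here is a minimal Monte Carlo sketch (my own, not Thompson's flow-diagram formalism) of the Gambler's Ruin problem, checked against the closed-form answer for a fair game:

```python
import random

def ruin_prob(k, n, trials=20000, p=0.5, seed=1):
    """Estimate the probability of ruin for a gambler who starts with
    stake k, bets 1 per round with win probability p, and stops at 0
    (ruin) or at target n."""
    random.seed(seed)
    ruins = 0
    for _ in range(trials):
        x = k
        while 0 < x < n:
            x += 1 if random.random() < p else -1
        ruins += (x == 0)
    return ruins / trials

# For a fair game the exact ruin probability is 1 - k/n.
k, n = 3, 10
est = ruin_prob(k, n)
assert abs(est - (1 - k / n)) < 0.02   # Monte Carlo matches theory
```

The point of the simulation approach is that the same loop keeps working when p is not 1/2, the bets vary, or the boundaries move, long after the neat closed form is gone.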

Click Here to see more reviews about: Simulation: A Modeler's Approach (Wiley Series in Probability and Statistics)



Buy Now

Click here for more information about Simulation: A Modeler's Approach (Wiley Series in Probability and Statistics)

Read More...

Best Practices for Equity Research Analysts: Essentials for Buy-Side and Sell-Side Analysts Review

Best Practices for Equity Research Analysts: Essentials for Buy-Side and Sell-Side Analysts
Average Reviews:

(More customer reviews)
My friend Tom Brakke liked this book and said I would too. He was right, and soon afterward I heard the author speak at the Baltimore CFA Society. Hearing James Valentine speak is an advantage here. He summarized what is most important, which, if you are reading the book, is chapter 20 (out of 27): his FaVeS framework of Forecast, Valuation, and Sentiment, in that order of importance. Remember that as a key to the book if you read it; it tells you what to focus on as an analyst.
Another key, since the book is long, is to look at the shaded summaries, usually at the end of each chapter. If stretched for time, read those first, and then read the chapter if you didn't get it.
This book aims to focus analysts on information that matters. Aim for information that makes a difference, and that few others have. Create an information web that maximizes the value of your time, and creates value for your research.
This book covers both the buy-side and the sell-side, telling each how to best use the other side. As a former buy-side analyst, to me it means fewer analyses, and better analyses. Aside from that, it is a game: buy-side: identify the better sell-side analysts and listen to them. Sell-side: identify clients that will generate commissions and market their best insights to them.
Regardless, analysts must identify the few factors that account for 80% of the performance in a given industry, and focus on those intensely. It helps to get into the industry organizations, which can help drive insight into the industry as a whole, and provide a backdrop for questions to ask when talking with executives in the industry.
Learning this will give an analyst a leg up on other analysts. Analysts should also understand the basic accounting structures of their industry so that they can identify companies that are not playing fair -- over-reporting income. I would add don't get negative too quickly. Frauds can develop a momentum of their own. Wait until the fraud gets large relative to the size of the industry before issuing a sell call -- wait for price momentum to go to zero. (Note: for investigative journalists, this does not apply. Jump on early, so that you can say that you warned everyone.)
Basic forensic accounting skills help, as do modeling skills, and basic statistical skills. I was surprised to learn a bunch of Excel shortcuts that I haven't seen elsewhere, and I have used Excel for nineteen years at a high level. The summary of accounting deviations is cogent, as well as pointing readers to Mulford and Schilit.
One idea that I heartily agree with: set up your spreadsheets to differentiate data and formulas. Cells with data series should only contain data. Formulas should have no numbers in them, unless they are trivial. This makes analysis a lot easier and cleaner in the long run.
The book also brings out the need to consider multiple scenarios, which help an analyst to flesh out his analysis. Being willing to consider what can go wrong, or right, enriches an analysis. Also, the book warns against common pathologies that overcome analysts, notably confirmation bias, overconfidence, self-attribution bias, optimism, recency, momentum, heuristics, familiarity, snakebite (won't go back to one that hurt you), falling in love, anxiety, over-reaction, loss-aversion, etc. I have experienced a few of those myself, and would have benefited from thinking these through before becoming an analyst.
Quibbles
I would warn any analyst trying to use simple or multiple regression that they are playing with fire, unless they understand the weaknesses of the data and the limitations of the general linear model. In twelve-plus years working on Wall Street, I never once saw regression used correctly.
The author seems to favor DCF over multiples. In truth, neither works well, and one must live with the weaknesses of whichever approach one chooses. DCF embeds a lot of assumptions that are known, though some may be wrong; multiples embed unknown assumptions.
The author does not like price-to-sales. For industrials and utilities I would say look at a chart of price versus price-to-sales. In most cases they track, because sales don't vary that much in the short run. If you know the high and low P/S ratios for companies in an industry (P/B for financials), you have valuable information. It gives you boundaries to look at in buy and sell decisions.
I would also warn analysts against using Damodaran and those like him. I don't think his models are wrong so much as impractical. I would rather use a simple model that catches 80-90% of the action than one that catches 100% of the action but cannot practically be calculated.
Who would benefit from this book:
All equity analysts would benefit from this book. It is detailed, and yet practical. Some of our competitors will benefit from it, and if you don't read it, you will wonder why.

Click Here to see more reviews about: Best Practices for Equity Research Analysts: Essentials for Buy-Side and Sell-Side Analysts



Buy Now - Get 45% OFF

Click here for more information about Best Practices for Equity Research Analysts: Essentials for Buy-Side and Sell-Side Analysts

Read More...

Offshore Risk Assessment: Principles, Modelling and Applications of QRA Studies (Springer Series in Reliability Engineering) Review

Offshore Risk Assessment: Principles, Modelling and Applications of QRA Studies (Springer Series in Reliability Engineering)
Average Reviews:

(More customer reviews)
This book has 15 chapters, covering everything from basic definitions, such as PLL and frequency of impairment, to chapters where explosion risk modelling is described in considerable detail. On an overall basis, I am sure every professional can learn something from this book. However, it is not a guideline outlining a stepwise QRA approach. It is rather a collection of chapters (sometimes of varying quality) where different aspects of risk analysis are discussed. Chapter 10 (Collision Risk Modelling) gives detailed information about both historical ship collision incidents and a model for calculating the collision energy and consequences. Obviously, the section about historical incidents is a bit outdated and includes some misleading information. Appendix A gives an overview of the software packages on the market. Some of the packages in the list are no longer supported. Furthermore, RiskSpectrum is a PSA tool, pretty much tailored to the nuclear business. To conclude: Mr. Vinnem and his team put a lot of effort into this book, and I think everybody can find something they like in it.
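For readers unfamiliar with the PLL (Potential Loss of Life) measure mentioned above: it is, in essence, a frequency-weighted fatality sum over accident scenarios. A minimal sketch with purely illustrative numbers (not from the book):

```python
# Potential Loss of Life (PLL): sum over accident scenarios of
# (annual frequency) x (expected fatalities). All numbers illustrative.
scenarios = [
    # (name,            frequency per year, expected fatalities)
    ("process fire",     1e-4,  2.0),
    ("riser rupture",    5e-6, 30.0),
    ("ship collision",   2e-5,  5.0),
]
pll = sum(freq * fatalities for _, freq, fatalities in scenarios)
print(f"PLL = {pll:.2e} fatalities per year")
```

Note how a rare but severe scenario (the riser rupture) can contribute as much to the total as a far more frequent minor one, which is the whole point of the measure.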

Click Here to see more reviews about: Offshore Risk Assessment: Principles, Modelling and Applications of QRA Studies (Springer Series in Reliability Engineering)

Offshore Risk Assessment was the first book to deal with quantified risk assessment (QRA) as applied specifically to offshore installations and operations. This book is a major revision of the first edition. It has been informed by a major R&D programme on offshore risk assessment in Norway (2002-2006). Not only does this book describe the state-of-the-art of QRA, it also identifies weaknesses and areas that need development.

Buy Now - Get 21% OFF

Click here for more information about Offshore Risk Assessment: Principles, Modelling and Applications of QRA Studies (Springer Series in Reliability Engineering)

Read More...

Linear and Nonlinear Programming (International Series in Operations Research & Management Science) Review

Linear and Nonlinear Programming (International Series in Operations Research and Management Science)
Average Reviews:

(More customer reviews)
I have profitably used the book to apply constrained minimization procedures in the field of computational contact mechanics. I think it is no secret that quite often books on mathematics are written by mathematicians for mathematicians, which makes it quite hard for engineers both to read them and to extract valuable information from them. In this respect this book is a shining star. It presents the topics in a very precise but clear and understandable way. Moreover, the notation is the best compromise between conciseness and clarity. Mathematicians, please look at this book and follow its style; we engineers desperately need to communicate with you.

Click Here to see more reviews about: Linear and Nonlinear Programming (International Series in Operations Research & Management Science)



Buy Now - Get 29% OFF

Click here for more information about Linear and Nonlinear Programming (International Series in Operations Research & Management Science)

Read More...

The Analysis of Structured Securities: Precise Risk Measurement and Capital Allocation Review

The Analysis of Structured Securities: Precise Risk Measurement and Capital Allocation
Average Reviews:

(More customer reviews)
Written for financial engineers, this book nevertheless can also be read profitably by anyone interested in mathematical modeling or mathematical finance. The authors discuss in fair detail the science of structured securities, which are financial products that are becoming more important as investors and financial firms continue to find more intricate ways of dealing with risk. For non-experts (such as this reviewer) in the field of structured finance, the book requires careful reading and attention to detail. Readers are expected to have an understanding of various mathematical topics such as Markov chains, linear algebra, Monte Carlo simulation, and probability and statistics.
As an investment strategy, the authors describe structured securities as performing best in "controlled" environments. This involves the use of `transaction documents', which are used to keep their performance within an expected range, and also `macro-level' controls to assist in dealing with event shocks. The basic idea of a structured security is to assemble a credit or investment package from a variety of sources and allow them to be administered by third parties. This entails that the sources (the transferors) be completely decoupled from the transferee, the latter of which is called a `special purpose entity' (SPE), and which has an extremely low likelihood of becoming insolvent by its own activities. The SPE is an analogue of the obligor, and is also shielded from the consequences of the insolvency of a related party. Its assets are thus `perfected' against the claims of the transferor.
Early in the book the authors describe what they consider to be the two types of structured securities. The first, called the `long-term transaction model' applies to asset-backed, mortgage-backed, and collateralized debt issues with maturity at least one year. The second, called the `short-term transaction model' applies to asset-backed commercial paper markets.
If structured securities are to be used as an investment strategy, their value must be assessed in as fine a detail as possible. This assessment is of course the main goal behind the authors' book, and they therefore spend a fair amount of time in explaining why the usual credit rating strategies are inadequate for structured securities. One of those discussed is `benchmark pool analysis' which does not require a large volume of data and uses a microeconomic model of the obligors in a collateral pool to simulate the financial impact of economic shocks. Others discussed include the actuarial method, used for asset-backed and mortgage-backed transactions, and the default method, which is used for collateralized debt obligations.
The most interesting discussions take place when the authors attempt to formulate a more exact, analytical notion of rating for structured securities than what is available with the usual corporate rating model. Essentially the authors are advocating a "unification" of credit and market risk in structured finance in their attempt to replace the alphanumeric scale of the usual corporate credit rating by a numerical scale (they motivate this interestingly by discussion involving the `continuum hypothesis' from set theory). Most important in their approach is to view the pricing of structured securities as a nonlinear problem: rating and pricing are entangled with each other, in that to obtain the rating the promised yield must be known; but to find the yield, the rating must be known. There is of course a paucity of exact solutions to nonlinear problems, and so numerical techniques must be used. The authors spend a fair amount of time discussing these techniques in the book, and in formulating the problem of structured pools as one involving (Markovian and non-stationary) stochastic processes.
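The rating/yield entanglement the authors describe is, structurally, a fixed-point problem, and the numerical techniques they discuss resolve it by iteration. The sketch below uses hypothetical stand-in mappings (my own, not the authors' actual model) purely to show the self-consistency loop:

```python
# The rating depends on the promised yield, and the yield demanded
# depends on the rating (via expected loss). Iterate to a fixed point.
# Both mapping functions below are hypothetical stand-ins.

def expected_loss(promised_yield):
    # Hypothetical: a higher promised coupon drains more cash from the
    # structure, raising the expected loss on the tranche.
    return 0.001 + 0.05 * promised_yield

def yield_required(loss):
    # Hypothetical: investors demand a spread over a 4% base rate
    # proportional to expected loss.
    return 0.04 + 2.0 * loss

y = 0.04                       # initial guess for the promised yield
for _ in range(100):
    y_new = yield_required(expected_loss(y))
    if abs(y_new - y) < 1e-12:
        break                  # converged: yield is self-consistent
    y = y_new
# y now equals the yield implied by the very loss it induces.
```

With these linear stand-ins the loop is a contraction and converges quickly; the authors' point is that real waterfalls make the mappings nonlinear, so only numerical iteration of this kind is available.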
As a warm-up to the complications of asset behavior, the authors first discuss the modeling of liabilities. The collection and distribution of cash to various parties is contained in the `pooling and servicing agreement' (P&S), which is a legally binding document that contains a collection of payment instructions called a `waterfall' or `structure.' A waterfall codifies the payment prioritization taken from the funds that are available. Examples are given that illustrate their analysis.
For those not familiar with Markov chains, the authors give a short review, and argue that they are important to structured finance due to their ability to eliminate long-term static pool data requirements. The Markov chains used in structured finance are finite-state Markov chains, where the states correspond to recognized delinquency states of an issuer in some asset class. The transition matrices of the associated asset pools represent the credit dynamics of structured securities. The authors give three very detailed examples of their formalism, the first one of these, dealing with automobile receivable securitizations, should be familiar to most readers.
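The delinquency-state chains described above can be sketched in a few lines; the states and transition probabilities here are illustrative, not taken from the book's examples:

```python
import numpy as np

# A toy finite-state Markov chain over delinquency states of an obligor:
# current -> 30 days past due -> 60 days past due -> default (absorbing).
# Transition probabilities are illustrative only.
states = ["current", "30dpd", "60dpd", "default"]
P = np.array([
    [0.95, 0.05, 0.00, 0.00],
    [0.60, 0.20, 0.20, 0.00],
    [0.30, 0.00, 0.30, 0.40],
    [0.00, 0.00, 0.00, 1.00],   # default is absorbing
])
assert np.allclose(P.sum(axis=1), 1.0)   # each row is a probability vector

# Distribution of a pool that starts 100% current, after 12 monthly steps:
pool = np.array([1.0, 0.0, 0.0, 0.0]) @ np.linalg.matrix_power(P, 12)
print(dict(zip(states, pool.round(4))))
```

This is why the chain eliminates the need for long static-pool histories: a single transition matrix, estimated from recent data, propagates the pool's credit dynamics forward as far as needed.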
The last chapter of the book deals with `triggers', which generalize the earlier discussion on liability modeling. The authors describe triggers as being the most `intricate' aspect of the analysis of structured securities. If one views them, via the physics analogy, as control structures, they are fairly straightforward to understand. `Cash flow triggers', which allow a reallocation of cash without being too disruptive or expensive, are the only type considered in this chapter. The cash reallocation is obtained through the use of a `trigger index', which is usually dependent on transaction variables such as delinquencies or tranche principal balances. A trigger is `breached' if its trigger index is higher than a pre-selected threshold on any determination date.
The authors discuss four basic types of triggers, all of which are defined mathematically in terms of the proportion P(x(t)) of excess spread to be reallocated and some variable function x(t) of the trigger index: `binary', in which all excess cash is reallocated to the spread account when there is a breach at time t; `proportional', which allows a kind of "ramping up" of the triggering; `differential', where the excess spread is proportional to the first derivative of x(t); and `integral', where P(x(t)) is proportional to the integral of x(t) over a time interval with lower bound the breaching time and the upper bound the current time. Monte Carlo simulations are used to optimize trigger mechanisms.
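The four trigger types map naturally onto the terms of a PID-style controller acting on the trigger index x(t). A sketch with hypothetical gains and threshold (the specific constants and the clipping of P to [0, 1] are my assumptions, not the authors' specification):

```python
# P(x) is the proportion of excess spread reallocated to the spread
# account. Gains k and the threshold are illustrative assumptions.

def binary(x, threshold):
    # All excess spread on any breach, none otherwise.
    return 1.0 if x > threshold else 0.0

def proportional(x, threshold, k=2.0):
    # Ramps up with the size of the breach.
    return min(1.0, k * max(0.0, x - threshold))

def differential(xs, dt=1.0, k=1.0):
    # Responds to the rate of change of the index (first difference).
    return min(1.0, max(0.0, k * (xs[-1] - xs[-2]) / dt))

def integral(xs, t_breach, dt=1.0, k=0.1):
    # Responds to the index accumulated since the breaching time.
    return min(1.0, k * sum(xs[t_breach:]) * dt)

xs = [0.00, 0.02, 0.06, 0.09]   # trigger index path, e.g. delinquency rate
assert binary(xs[-1], threshold=0.05) == 1.0
```

Seen this way, the Monte Carlo optimization the authors describe is simply tuning the controller's gains against simulated index paths.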

Click Here to see more reviews about: The Analysis of Structured Securities: Precise Risk Measurement and Capital Allocation

The Analysis of Structured Securities presents the first intellectually defensible framework for systematic assessment of the credit quality of structured securities. It begins with a detailed description and critique of methods used to rate asset-backed securities, collateralized debt obligations and asset-backed commercial paper. The book then proposes a single replacement paradigm capable of granular, dynamic results. It offers extensive guidance on using numerical methods in cash flow modeling, as well as a groundbreaking section on trigger optimization. Casework on applying the method to automobile ABS, CDOs-of-ABS and aircraft-lease securitizations is also presented. This book is essential reading for practitioners who seek higher precision, efficiency and control in managing their structured exposures.

Buy Now - Get 20% OFF

Click here for more information about The Analysis of Structured Securities: Precise Risk Measurement and Capital Allocation

Read More...