RESEARCH PAPERS

Stress Intensity Factors for Cracks in Conventional S-N Fatigue Specimens

Author and Article Information
T. P. O’Donnell

O’Donnell Consulting Engineers, 4717 Doverdell Drive, Pittsburgh, PA 15236

J. Pressure Vessel Technol. 118(2), 203-207 (May 01, 1996); 5 pages; doi:10.1115/1.2842182
History: Received May 23, 1994; Revised June 29, 1995; Online February 11, 2008

Abstract

Stress intensity values for cracks growing in conventional fatigue specimens are determined, with emphasis on the end constraint conditions associated with S-N fatigue testing. Three-dimensional finite element analysis methods are used to analyze thumbnail-shaped cracks in cylindrical geometries. Crack front straightening due to the increased bending introduced as crack growth progresses is included in the models. Because relatively stiff fatigue test machines prevent rotation at the clamped ends of test specimens, uniform axial displacement boundary conditions are imposed. Results for uniformly applied axial stress end conditions are also obtained for comparison. For crack-depth-to-specimen-diameter ratios over one-third, bending restraint induced in the specimens under applied axial displacement significantly reduces the resulting stress intensity relative to values computed for uniform end tension. The results are useful for evaluating crack growth in fatigue specimens within the limits of linear elastic fracture mechanics.
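The reduction in stress intensity under fixed-displacement end conditions can be viewed through the basic linear elastic fracture mechanics relation K = Y * sigma * sqrt(pi * a). The sketch below (Python) contrasts a load-controlled case, where the remote stress stays fixed as the crack deepens, with a displacement-controlled case in which the effective stress driving the crack falls as the specimen loses stiffness. The geometry-factor and stiffness-loss functions are illustrative placeholder assumptions, not the paper's three-dimensional finite element results; the sketch shows only the qualitative trend that clamped, fixed-displacement ends lower K for deeper cracks.

# Illustrative sketch only -- not the paper's 3-D finite element solution.
# It contrasts load-controlled and displacement-controlled loading using the
# basic LEFM relation K = Y * sigma * sqrt(pi * a). The geometry factor Y(a/D)
# and the stiffness-loss model below are assumed placeholders.

import math

def stress_intensity(sigma, a, Y):
    """Mode I stress intensity, K = Y * sigma * sqrt(pi * a)."""
    return Y * sigma * math.sqrt(math.pi * a)

def geometry_factor(a_over_D):
    """Placeholder geometry factor for a thumbnail crack in a round bar.
    A real analysis would take Y from tabulated or finite element results."""
    return 0.7 + 0.4 * a_over_D  # assumed trend, for illustration only

def remaining_stiffness_fraction(a_over_D):
    """Assumed fraction of the uncracked specimen stiffness (placeholder model)."""
    return max(0.0, 1.0 - 1.3 * a_over_D**1.5)

D = 10.0e-3            # specimen diameter, m (assumed)
sigma_applied = 200e6  # remote stress for the load-controlled case, Pa (assumed)

print("a/D    K_load_control   K_disp_control   (MPa*sqrt(m))")
for a_over_D in (0.1, 0.2, 0.3, 0.4, 0.5):
    a = a_over_D * D
    Y = geometry_factor(a_over_D)
    # Load control: the remote stress stays fixed as the crack grows.
    K_load = stress_intensity(sigma_applied, a, Y)
    # Displacement control: the net load falls as the specimen loses stiffness,
    # so the effective stress (and hence K) is reduced for deeper cracks.
    sigma_disp = sigma_applied * remaining_stiffness_fraction(a_over_D)
    K_disp = stress_intensity(sigma_disp, a, Y)
    print(f"{a_over_D:4.2f}   {K_load/1e6:13.1f}   {K_disp/1e6:14.1f}")

With these placeholder functions the two estimates diverge increasingly as a/D grows, qualitatively consistent with the abstract's observation that the reduction in stress intensity under applied displacement becomes significant for crack-depth-to-diameter ratios beyond about one-third.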

Copyright © 1996 by The American Society of Mechanical Engineers