Research Papers: Design and Analysis

An Improved Anisotropic Tertiary Creep Damage Formulation

Author and Article Information
Calvin M. Stewart, Ali P. Gordon

Department of Mechanical, Materials, and Aerospace Engineering, University of Central Florida, Orlando, FL 32816-2450

Young Wha Ma

Department of Mechanical Engineering, Chung Ang University, 221 Huksuk Dongjak, Seoul 156-756, Korea

Richard W. Neu

George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0405

J. Pressure Vessel Technol 133(5), 051201 (Jul 11, 2011) (10 pages) doi:10.1115/1.4002497 History: Received March 18, 2010; Revised August 13, 2010; Published July 11, 2011; Online July 11, 2011

Directionally solidified (DS) Ni-base superalloys are commonly used as gas turbine materials, primarily to extend the operational lives of components under high load and temperature. The DS grain structure produces an elongated grain orientation that exhibits enhanced impact strength, high-temperature creep and fatigue resistance, and improved corrosion resistance compared with off-axis orientations. Of concern to turbine designers are the effects of cyclic fatigue, thermal gradients, and potential stress concentrations when dealing with orientation-dependent materials. When coupled with a creep environment, accurate prediction of crack initiation and propagation becomes highly dependent on the quality of the constitutive damage model implemented. This paper describes the development of an improved anisotropic tertiary creep damage model implemented in general-purpose finite element analysis software. The creep damage formulation is a tensorial extension of a variation of the Kachanov–Rabotnov isotropic tertiary creep damage formulation. The net/effective stress arises from the use of the Rabotnov second-rank symmetric damage tensor. The Hill anisotropic behavior analogy is used to model secondary creep and tertiary creep damage behaviors. Using available experimental data for a directionally solidified Ni-base superalloy, the improved formulation is found to accurately model specimens of intermediate orientation.
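The isotropic Kachanov–Rabotnov formulation that the paper extends couples a power-law creep rate to a scalar damage variable ω that grows toward unity at rupture. A minimal forward-Euler sketch of that classical isotropic model is given below; the material constants are illustrative placeholders, not the paper's calibrated DS GTD-111 parameters, and the paper's actual contribution (the tensorial, anisotropic extension) is not reproduced here.

```python
def kachanov_rabotnov(stress, t_end, dt, A, n, M, chi, phi):
    """Forward-Euler integration of the classical isotropic
    Kachanov-Rabotnov equations:
        creep strain rate:  d(eps)/dt   = A * (sigma / (1 - omega))**n
        damage rate:        d(omega)/dt = M * sigma**chi / (1 - omega)**phi
    Integration stops when the damage omega approaches 1 (creep rupture).
    Returns a list of (time, creep_strain, damage) samples.
    """
    eps, omega, t = 0.0, 0.0, 0.0
    history = []
    while t < t_end and omega < 0.99:
        eps += dt * A * (stress / (1.0 - omega)) ** n
        omega = min(1.0, omega + dt * M * stress ** chi / (1.0 - omega) ** phi)
        t += dt
        history.append((t, eps, omega))
    return history

# Illustrative constants only -- NOT the calibrated DS GTD-111 values.
hist = kachanov_rabotnov(stress=289.0, t_end=1e4, dt=1.0,
                         A=1e-12, n=4.0, M=1e-13, chi=4.0, phi=3.0)
t_rupture = hist[-1][0]  # tertiary creep accelerates until rupture
```

The damage variable feeds back into the creep rate through the net/effective stress σ/(1 − ω), which is what produces the accelerating tertiary stage; the anisotropic model in the paper replaces the scalar ω with the Rabotnov second-rank symmetric damage tensor.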

Copyright © 2011 by American Society of Mechanical Engineers
Topics: Creep, Stress, Tensors, Anisotropy



Figure 1: Schematic of a transversely isotropic material under multiaxial loading

Figure 2: Schematic of cavity growth on grain boundaries for (a) aluminum and (b) copper

Figure 3: Equivalence of the physical and effective (CDM) spaces

Figure 4: Rupture of (a) L-, (b) T-, and (c) off-axis oriented specimens

Figure 5: Damage evolution on the x3 normal for the ISO, ANI, and IM-ANI formulations under 289 MPa uniaxial load at 871°C

Figure 6: Creep deformation on the x3 normal for the ISO, ANI, and IM-ANI formulations compared with creep test data for DS GTD-111 under 289 MPa uniaxial load at 871°C

Figure 7: Components of the creep deformation for (a) L-, (b) T-, and (c) 45 deg oriented specimens under 289 MPa uniaxial load at 871°C (note: primary creep is neglected)

Figure 8: Parametric material rotation study of creep deformation and damage evolution on the x3 normal for (a) and (b) tensile and (c) and (d) compressive creep conditions at 289 MPa and 871°C

Figure 9: Parametric uniaxial stress rotation study of creep deformation and damage evolution on the x3 normal for (a) and (b) L-, (c) and (d) 45 deg, and (e) and (f) T-oriented specimens
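The orientation dependence explored in the rotation studies above enters through a Hill-type quadratic equivalent stress, which the abstract names as the anisotropy analogy for both secondary creep and damage. A minimal sketch of Hill's (1948) criterion follows; the coefficients used in the example are the isotropic special case (which recovers the von Mises stress), not the paper's calibrated anisotropy constants.

```python
import numpy as np

def hill_equivalent_stress(sig, F, G, H, L, M, N):
    """Hill (1948) quadratic equivalent stress for an anisotropic material.
    sig: 3x3 symmetric Cauchy stress tensor (indices 0, 1, 2 ~ x1, x2, x3).
    F..N: Hill anisotropy coefficients.
    """
    s11, s22, s33 = sig[0, 0], sig[1, 1], sig[2, 2]
    s23, s31, s12 = sig[1, 2], sig[2, 0], sig[0, 1]
    return np.sqrt(F * (s22 - s33) ** 2
                   + G * (s33 - s11) ** 2
                   + H * (s11 - s22) ** 2
                   + 2.0 * L * s23 ** 2
                   + 2.0 * M * s31 ** 2
                   + 2.0 * N * s12 ** 2)

# With F = G = H = 0.5 and L = M = N = 1.5 the expression reduces to the
# von Mises stress, so a uniaxial load returns its own magnitude.
sig = np.zeros((3, 3))
sig[2, 2] = 289.0  # uniaxial load along x3, as in the paper's creep tests
s_eq = hill_equivalent_stress(sig, 0.5, 0.5, 0.5, 1.5, 1.5, 1.5)  # 289.0
```

Rotating either the material frame or the applied stress relative to the grain direction changes which Hill coefficients the load components activate, which is the mechanism behind the orientation-dependent creep and damage curves in Figs. 8 and 9.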


