#Computer-Science#DataStructure#CS251

  • An array is a fixed-size, sequential data structure. You probably already know what it is. This page doesn’t cover arrays from a programming perspective; it looks at them in terms of algorithm analysis
  • Accessing the data within an array is O(1). It is a constant time operation
  • Inserting an element at the end of an array
    • If you keep track of the next available index it is O(1) or constant time
    • If you don’t, then it becomes O(n) because you have to loop through the array to find an open spot (see the sketch below)
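  A minimal Java sketch of the tracked-index idea (the `FixedArray` name and the fixed `int` capacity are my own illustrative assumptions, not from the notes):

  ```java
  // A minimal sketch (names invented for illustration): appending with a tracked index.
  public class FixedArray {
      private final int[] data;
      private int size;                      // next available index

      public FixedArray(int capacity) {
          data = new int[capacity];
          size = 0;
      }

      // O(1): we already know where the next open slot is.
      public void append(int value) {
          if (size == data.length) {
              throw new IllegalStateException("array is full");
          }
          data[size++] = value;
      }
      // Without the 'size' field we would have to scan from index 0 for an
      // unused slot before every append, making the operation O(n).
  }
  ```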
  • Inserting an element into the middle of a sorted array
    • Determining the position to insert and then shifting every later element one slot to the right to make room is O(n), linear time (see the sketch below)
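  A small Java sketch of the shift-to-the-right insertion (the names and the linear scan for the position are my own choices for illustration):

  ```java
  import java.util.Arrays;

  public class SortedInsert {
      // data has spare capacity (data.length > size); size = elements currently stored
      static void insertSorted(int[] data, int size, int value) {
          int pos = 0;
          while (pos < size && data[pos] < value) {   // find the insertion point
              pos++;
          }
          for (int i = size; i > pos; i--) {          // shift the tail right: O(n)
              data[i] = data[i - 1];
          }
          data[pos] = value;
      }

      public static void main(String[] args) {
          int[] data = {1, 3, 5, 7, 0, 0};            // 4 real elements, 2 spare slots
          insertSorted(data, 4, 4);
          System.out.println(Arrays.toString(data));  // [1, 3, 4, 5, 7, 0]
      }
  }
  ```

  A binary search could find the insertion point in O(log n), but the shifting still dominates, so the insertion as a whole stays O(n).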
  • Resizing an array
    • Increasing the size of an array involves creating a new array with a larger size, then copying the elements from the old array into the new one
      • This process is O(n), since every existing element has to be copied
    • To minimize how often we have to repeat this expansion, we generally double the size of the old array (see the sketch below)
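  A sketch of the doubling strategy, assuming a small growable wrapper class (the `GrowableArray` name and the starting capacity of 1 are illustrative, not from the course):

  ```java
  import java.util.Arrays;

  public class GrowableArray {
      private int[] data = new int[1];   // start tiny so resizes actually happen
      private int size = 0;

      public void append(int value) {
          if (size == data.length) {
              // the O(n) part: allocate double the capacity and copy everything over
              data = Arrays.copyOf(data, data.length * 2);
          }
          data[size++] = value;          // the append itself is O(1)
      }
  }
  ```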
    • Because this resizing is done so infrequently we amortize this operation.
      • To amortize means to average the cost over many operations. An amortized runtime is essentially an average runtime: a single operation might occasionally cost more, but over a long sequence of operations that occasional expensive one stops mattering
      • Despite resizing the array being a costly operation, we do it so infrequently that it doesn’t matter
    • Proof of amortized runtime with array resizing
      • Let T(k) be the runtime cost of the k-th insertion. The amortized runtime cost over n insertions would be $\frac{1}{n}\sum_{k=1}^{n} T(k)$
      • When we don’t resize the array, since there’s space available, T(k) is O(1). When the array is full and a resize is triggered (the resize terms of the summation) we have a resize cost of O(k), because every existing element must be copied into the new array
      • If we solve this summation with those facts in mind we end up with the following: with doubling, the resize costs form the geometric series $1 + 2 + 4 + \dots \le 2n$, so $\frac{1}{n}\sum_{k=1}^{n} T(k) \le \frac{1}{n}(n + 2n) = 3 = O(1)$, i.e. insertion at the end is amortized constant time (a worked numeric example follows below)
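  As a concrete check of the summation, here is one worked case (my own numbers, assuming the array starts at capacity 1 and doubles each time it fills up):

  ```latex
  % Worked example (assumes amsmath): n = 8 appends, capacity starts at 1 and doubles.
  % Each append writes one element; the resizes copy 1, 2, and 4 elements respectively.
  \[
    \sum_{k=1}^{8} T(k)
      = \underbrace{8}_{\text{writes}} + \underbrace{(1 + 2 + 4)}_{\text{resize copies}}
      = 15 \le 3 \cdot 8,
    \qquad \frac{1}{8}\sum_{k=1}^{8} T(k) = \frac{15}{8} < 3.
  \]
  ```

  Fifteen units of work for eight insertions stays under the 3n bound, so each insertion costs less than 3 units on average, i.e. O(1) amortized.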