Posts

CST 370 Week 6

This week in class, I learned about heaps, AVL trees, and 2-3 trees. For each of these trees, I focused on better understanding the algorithms needed to insert, delete, and search for values. Even when I studied the materials provided in class this week, these concepts were more difficult to understand than the concepts from previous weeks. What helped me the most in retaining the material covered this week was the programming assignment. Actually applying the pseudocode from the textbook to writing functions improved my understanding of how these algorithms work. Since our programming assignment focused on heaps, I have a better understanding of how heaps work compared to the other trees discussed. When I have the chance, I will try to implement my own AVL tree and 2-3 tree data structures to help myself in the future.
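Since the assignment centered on heaps, here is a minimal array-based min-heap sketch in Python (not the actual assignment code, which is not shown here) illustrating the sift-up and sift-down steps behind insertion and deletion:

```python
class MinHeap:
    """Array-based min-heap: parent of index i is (i - 1) // 2,
    children are 2*i + 1 and 2*i + 2."""

    def __init__(self):
        self.data = []

    def insert(self, value):
        self.data.append(value)
        i = len(self.data) - 1
        # Sift up: swap with the parent while the heap property is violated.
        while i > 0 and self.data[(i - 1) // 2] > self.data[i]:
            parent = (i - 1) // 2
            self.data[i], self.data[parent] = self.data[parent], self.data[i]
            i = parent

    def pop_min(self):
        root = self.data[0]
        last = self.data.pop()
        if self.data:
            self.data[0] = last
            self._sift_down(0)
        return root

    def _sift_down(self, i):
        # Sift down: swap with the smaller child until the heap property holds.
        n = len(self.data)
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and self.data[child] < self.data[smallest]:
                    smallest = child
            if smallest == i:
                return
            self.data[i], self.data[smallest] = self.data[smallest], self.data[i]
            i = smallest
```

Both operations only walk one root-to-leaf path, which is where the O(log n) cost per insert or delete comes from.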

CST 370 Week 5

This week I developed a better understanding of how quick-sort algorithms can outperform merge-sort algorithms. Previously, I thought that since quick-sort's worst case is O(n²) and merge-sort's worst case is O(n log n), merge-sort would always be more practical for sorting. I have learned, though, that the worst inputs for quick-sort, such as an already sorted array or choosing the min or max element as a pivot, are not common. Comparing only the average and best cases, merge-sort and quick-sort both yield O(n log n) time complexities. According to the textbook, the efficiency of the innermost loop of the quick-sort algorithm is what allows it to perform better than the merge-sort algorithm in most situations. I have learned from additional online resources that other factors, such as memory, also allow quick sorting to run faster than merge sorting, since quick-sort sorts arrays in place.
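To illustrate the in-place behavior, here is a quick-sort sketch using Lomuto partitioning with the last element as the pivot (the textbook's version may use a different partitioning scheme, such as Hoare's):

```python
def partition(a, lo, hi):
    """Lomuto partition: place a[hi] (the pivot) in its final position,
    with smaller elements to its left and larger ones to its right."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    """Sort the list a in place; no auxiliary arrays are allocated."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)   # sort elements left of the pivot
        quicksort(a, p + 1, hi)   # sort elements right of the pivot
```

Note that the sorted-array worst case shows up here directly: if `a` is already sorted, the last-element pivot makes every partition lopsided, giving the O(n²) behavior mentioned above.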

CST 370 Week 4

This week, I continued to learn about the divide-and-conquer approach to solving problems. The class lessons used merge sorting as an example to demonstrate how divide-and-conquer algorithms can be implemented. An unsorted array is repeatedly split in half into subarrays until the subarrays are one element in size. At this point, the subarrays are considered sorted and can be merged back together in sorted order. This is repeated until all the sorted subarrays are merged in order. Using the master theorem, I was shown how this sorting method results in an n log n order of growth. In community college, I had to show how many operations were needed to sort an array of size n by counting operations at each step of the algorithm. Learning how to apply the master theorem to merge sorting has made understanding time complexities much simpler.
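As a worked instance, the merge-sort recurrence fits the master theorem form T(n) = aT(n/b) + Θ(nᵈ):

```latex
% Merge sort: two subproblems of half the size, plus linear-time merging.
T(n) = 2\,T(n/2) + \Theta(n), \qquad a = 2,\; b = 2,\; d = 1.
% Since a = b^d (that is, 2 = 2^1), the "equal" case of the master theorem applies:
T(n) \in \Theta(n^d \log n) = \Theta(n \log n).
```

This replaces the step-by-step operation counting with a single check of how a compares to bᵈ.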

CST 370 Week 3

My studies this week revolved around different search algorithms. The exhaustive search approach for graphs is fairly straightforward, as it involves finding solutions by testing all possible combinations of paths. Although simple in concept, this approach can be extremely inefficient since the number of possible paths grows rapidly as the number of nodes in a graph increases. I also learned how to perform a DFS to traverse a graph, which goes as deep as possible along each path before backtracking. This strategy pushes nodes onto a stack and ensures the deepest nodes are visited first. The material this week also covered performing a BFS, which explores the nodes adjacent to the current node first. This strategy uses a queue to ensure the closest nodes are visited first.
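A minimal sketch of both traversals in Python (assuming the graph is stored as an adjacency dictionary; DFS uses the call stack rather than an explicit one):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal: a queue ensures the closest nodes come out first."""
    visited = [start]
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.append(neighbor)
                queue.append(neighbor)
    return visited

def dfs(graph, start, visited=None):
    """Depth-first traversal: recursion goes as deep as possible, then backtracks."""
    if visited is None:
        visited = []
    visited.append(start)
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
    return visited
```

The only structural difference is the queue versus the (implicit) stack, which is exactly what produces the breadth-first versus depth-first visit orders.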

CST 370 Week 2

During this week of class, I spent much of my time learning about the different ways to describe the growth rate of functions. I found the informal definitions in the textbook helpful in understanding this topic. Recognizing big O, big omega, and big theta as the sets of all functions that grow no faster than, no slower than, or at the same rate as a given function made this material a little easier to digest. Additionally, I began to read about recursion and how to analyze the time complexity of recursive algorithms. I used a recursive merge sort algorithm in hw2-2 to finish the assignment. The merge sort function divides a large array into two halves over and over until it reaches its base case. It then merges the subarrays into sorted subarrays over and over until it returns one sorted array.
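A generic sketch of the recursive merge sort described above (not the actual hw2-2 code):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # one of these two is already empty
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Split in half until the base case (size <= 1), then merge back up."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

The recursion depth is about log n and each level does linear merging work, which is where the O(n log n) bound comes from.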

CST 370 Week 1

This week’s reading material provided an introduction to the design and analysis of algorithms. I have taken classes on data structures and algorithms before, so nothing this week seemed entirely new to me. One example problem covered this week that I believe highlights the importance of this class is finding the greatest common divisor of two non-negative integers. The reading materials from this week covered several ways to accomplish this. The intuitive way to solve this problem involves starting at the smaller of the two numbers and checking each descending integer until a common divisor is found. Another way is Euclid's algorithm, which uses recursion to solve the problem. Comparing how many operations each approach requires highlights the importance of analyzing the time complexity of algorithms.
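Both approaches are short enough to sketch side by side (assuming positive inputs for the brute-force version):

```python
def gcd_descending(m, n):
    """Brute force: start at min(m, n) and count down
    until a common divisor is found. Assumes m, n > 0."""
    t = min(m, n)
    while t > 1:
        if m % t == 0 and n % t == 0:
            return t
        t -= 1
    return 1

def gcd_euclid(m, n):
    """Euclid's algorithm: gcd(m, n) = gcd(n, m mod n), with gcd(m, 0) = m."""
    if n == 0:
        return m
    return gcd_euclid(n, m % n)
```

The brute-force version can take on the order of min(m, n) iterations, while Euclid's algorithm shrinks its arguments much faster, which is exactly the difference in operation counts the reading uses to motivate complexity analysis.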

CST 334 Week 8

Three important topics I have learned about in this class are virtualization, concurrency, and persistence. Virtualization allows multiple processes, the dynamic instances of running programs, to run in parallel, even on single-processor systems. To accomplish this, the OS virtualizes physical memory, which gives each process its own address space, and controls its use of the CPU through context switching. In this class, I learned about several memory management techniques, such as paging and segmentation, as well as different scheduling algorithms, such as multilevel feedback queue scheduling. Concurrency is essential in the execution of multiple threads, the most basic units of execution within a process. Issues can occur when multiple threads attempt to access and manipulate shared data. Race conditions and inconsistent program results can occur if the order in which threads enter a critical section is not regulated. To ensure atomicity, synchronization primitives such as locks can be used.
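A small Python sketch of the idea (the course material itself is likely in C, but the pattern is the same): a lock guards the critical section so the shared counter's read-modify-write is atomic.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        # The lock makes the read-modify-write of the shared counter atomic,
        # so the two threads cannot interleave inside the critical section.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, counter is exactly 200000; without it, updates could be lost.
```

Without the lock, both threads could read the same old value of `counter` and write back the same new value, losing an increment, which is precisely the race condition described above.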