eCommons Collection: http://hdl.handle.net/1813/47 (harvested 2015-07-05T04:10:39Z)

Title: Cultural Resilience and Identity in Contemporary Death Rituals of the Chinese Hoa in Ho Chi Minh City
Handle: http://hdl.handle.net/1813/40329
Date: 2015-05-24
Authors: Duong, Tra Huong Thi
Abstract: In this thesis, I have combined a historical analysis of traditional Chinese death rituals in China with an ethnographic record of contemporary death rituals practiced in the Chinese Hoa community in Ho Chi Minh City. At its core, this thesis is a study of Chinese Hoa cultural resilience and adaptation, and of the use of death rituals as a process of "reinscription" of Chinese Hoa cultural identity through the maintenance of traditional customs and practices. During my research it became evident from the traditional and contemporary rituals analyzed here that contemporary death rituals are in many ways more complex, albeit generally less onerous, than their traditional predecessors. Rather than adopting Vietnamese traditions and incorporating them into their own death rituals, the Chinese Hoa people in Ho Chi Minh City have instead modified and re-invented old rituals and situated them within a modern context as a means of maintaining their ethnic identity.

Title: Bernie Sanders: The Working Classes' Candidate
Handle: http://hdl.handle.net/1813/40141
Date: 1989-05-01
Authors: Hill, Catherine Alison
Abstract: This thesis is the story of Bernie Sanders, the Socialist Mayor of Burlington, and his campaign for Governor of Vermont in 1986. The campaign is used as a prism to explore his version of socialist politics and policies within a capitalist state. The policies which Sanders developed in this campaign (lowering property taxes for middle- and lower-income people, increasing social spending, increasing citizen participation, and raising taxes on wealthy people and corporations) are examined in detail. Sanders claims that city governments can work for poor and working-class people; however, this thesis demonstrates the difficulties leftists have in getting elected and in implementing policies when they do win. In conclusion, I examine questions about left participation in the electoral process, the autonomy of the state, and what socialist municipal and state policies should be.

Title: Representation Of Uncertainty And Corridor Dp For Hydropower Optimization
Handle: http://hdl.handle.net/1813/39482
Date: 2015-01-26
Authors: Lamontagne, Jonathan
Abstract: This thesis focuses on optimization techniques for multi-reservoir hydropower system operation, with particular concern for the representation and impact of uncertainty. The thesis reports on three research investigations: 1) examination of the impact of uncertainty representations, 2) efficient solution methods for multi-reservoir stochastic dynamic programming (SDP) models, and 3) diagnostic analyses for hydropower system operation. The first investigation explores the value of sophistication in the representation of forecast and inflow uncertainty in stochastic hydropower optimization models using a sampling SDP (SSDP) framework. SSDP models with uncertainty representations ranging in sophistication from simple deterministic to complex dynamic stochastic models are employed to optimize a single-reservoir system [similar to Faber and Stedinger, 2001]. The effect of uncertainty representation on simulated system performance is examined with varying storage and powerhouse capacity, and with random or mean energy prices. In many cases very simple uncertainty models perform as well as more complex ones, but not always. The second investigation develops a new and efficient algorithm for solving multi-reservoir SDP models: Corridor SDP. Rather than employing a uniform grid across the entire state space, Corridor SDP efficiently concentrates points in the regions the system is likely to visit, as determined by historical operations or simulation. Radial basis functions (RBFs) are used for interpolation. A greedy algorithm places points where they are needed to achieve a good approximation. In a four-reservoir test case, Corridor DP achieves the same accuracy as spline-DP and linear-DP with approximately 1/10 and 1/1100 the number of discrete points, respectively.
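The interpolation step can be sketched concretely. Below is a minimal illustration of scattered-point RBF interpolation of a value function; the Gaussian kernel, shape parameter, toy two-reservoir storage points, and toy value function are all assumptions for illustration, not the thesis's implementation:

```python
import numpy as np

def rbf_fit(centers, values, eps=1.0):
    """Solve Phi w = values for the RBF weights, where Phi is the
    Gaussian kernel matrix over the scattered state-space points."""
    d2 = ((centers[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.linalg.solve(np.exp(-eps * d2), values)

def rbf_eval(x, centers, w, eps=1.0):
    """Evaluate the fitted interpolant at query points x of shape (m, d)."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2) @ w

# Points concentrated along a "corridor" of storages the system tends to visit
centers = np.array([[0.1, 0.1], [0.2, 0.3], [0.4, 0.4], [0.6, 0.5], [0.9, 0.8]])
values = centers.sum(axis=1)   # toy value function V(s1, s2) = s1 + s2
w = rbf_fit(centers, values)
approx = rbf_eval(np.array([[0.3, 0.35]]), centers, w)
```

The interpolant reproduces the sampled values exactly at the centers; accuracy between points depends on where the points are placed, which is what the greedy placement step controls.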
When local curvature is more pronounced (due to minimum-flow constraints), Corridor DP achieves the same accuracy as spline-DP and linear-DP with approximately 1/30 and 1/215 the number of points, respectively. The third investigation explores three diagnostic approaches for analyzing hydropower system operation. First, several simple diagnostic statistics describe reservoir volume and powerhouse capacity in units of time, allowing scale-invariant comparisons and classification of different reservoir systems and their operation. Second, a regression analysis using optimal storage/release sequences identifies the most useful hydrologic state variables. Finally, spectral density estimation identifies critical time scales of operation for several single-reservoir systems considering mean and random energy prices. Deregulation of energy markets has made optimization of hydropower operations an active concern. Another development is the publication of Extended Streamflow Forecasts (ESP) by the National Weather Service (NWS) and others to describe flow forecasts and their precision; the multivariate Sampling SDP models employed here are appropriately structured to incorporate such information in operational hydropower decisions. This research contributes to our ability to structure and build effective hydropower optimization models.

Title: The Use Of Nanoparticles To Assess Subsurface Flow Heterogeneity
Handle: http://hdl.handle.net/1813/39481
Date: 2015-01-26
Authors: Zhao, Yushi
Abstract: Understanding subsurface flow conditions is difficult but very important. Tracer tests have long been used as a diagnostic tool to assess subsurface fluid flow conditions. However, conventional ionic tracers are highly diffusive, so during a prolonged field test the resolution of the breakthrough curve is usually degraded by their rapid diffusion. Inert nanoparticle tracers are much larger than ionic tracers and far less diffusive. Laboratory-scale dual nanoparticle and chemical tracer experiments in both aqueous and CO2-based systems are presented in this dissertation, as well as a field test in a "single crack" sub-horizontal bedrock fracture system. These tests demonstrated that the CDot nanoparticles behave inertly both in the laboratory and under natural groundwater conditions. Differential arrival patterns between inert nanoparticle tracers and ionic tracers suggest that the particle tracers give higher-resolution breakthroughs. Moreover, in the field test, channelized flow is suggested by the erratic arrival of the inert particle tracer, which is further confirmed by the strongly retarded arrival of a surface-area-dependent sorbing ionic tracer injected simultaneously.

Title: Applications Of Multi-Objective, Mixed-Integer And Hybrid Global Optimization Algorithms For Computationally Expensive Groundwater Problems
Handle: http://hdl.handle.net/1813/39480
Date: 2015-01-26
Authors: Wan, Ying
Abstract: This research focuses on the development and implementation of efficient optimization algorithms that can solve a range of computationally expensive groundwater simulation-optimization problems. Because groundwater model evaluations are expensive, it is important to find accurate solutions with relatively few function evaluations. As a result, all the algorithms tested in this research are evaluated on a limited computation budget. The first contribution of the thesis is a comparative evaluation of a novel multi-objective optimization algorithm, GOMORS, against three other popular multi-objective optimization methods on applications to groundwater management problems within a limited number of objective function evaluations. GOMORS involves surrogate modeling via Radial Basis Function approximation and evolutionary strategies. The primary aim of the analysis is to assess the effectiveness of multi-objective algorithms in groundwater remediation management through multi-objective optimization within a limited evaluation budget. Three sets of dual objectives are evaluated. The objectives include minimization of cost, pollution mass remaining/pollution concentration, and cleanup time. Our results indicate that the overall performance of GOMORS is better than that of the three other algorithms, AMALGAM, BORG, and NSGA-II, in identifying good trade-off solutions. Furthermore, GOMORS incorporates modest parallelization to make it even more efficient. The next contribution is the application of SO-MI, a surrogate-model-based algorithm designed for computationally expensive nonlinear and multimodal mixed-integer black-box optimization problems, to solve groundwater remediation design problems (NL-MIP). SO-MI utilizes surrogate models to guide the search, thus conserving the expensive function evaluation budget, and is able to find accurate solutions with relatively few function evaluations.
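Comparisons like these hinge on extracting the non-dominated (Pareto) set from the evaluated designs. A minimal two-objective filter, with made-up (cost, mass-remaining) pairs purely for illustration:

```python
def pareto_front(points):
    """Return the non-dominated (cost, mass_remaining) pairs, both objectives
    minimized: p is dropped if some q is at least as good in both objectives
    and strictly better in at least one."""
    return [
        p for p in points
        if not any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for q in points
        )
    ]

# Hypothetical (cost, pollution-mass-remaining) design outcomes
designs = [(1.0, 5.0), (2.0, 4.0), (3.0, 3.0), (2.0, 6.0), (4.0, 4.0)]
front = pareto_front(designs)   # keeps only the genuine trade-off designs
```

A multi-objective algorithm's quality is then judged by how well its front approximates the true one within the evaluation budget.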
We present numerical results to show the effectiveness and efficiency of SO-MI in comparison to a Genetic Algorithm (GA) and NOMAD, two popular mixed-integer optimization algorithms. The results indicate that SO-MI is statistically better than GA and NOMAD in both study cases. Chapter 4 describes DYCORS-PEST, a novel method developed for high-dimensional, computationally expensive, multimodal calibration problems when the computation budget is limited. This method integrates the local optimizer PEST into the global optimization framework DYCORS. The novelty of DYCORS-PEST is its memetic approach to improving solution accuracy: DYCORS selects the point at which the search switches to the local method PEST and when it switches back to the global phase. Since PEST is a very efficient and widely used local search algorithm for groundwater model calibration, incorporating PEST into DYCORS-PEST is a good enhancement for PEST and easy for PEST users to learn. DYCORS-PEST achieves the goal of solving computationally expensive black-box problems by forming a response surface of the expensive function, thus reducing the number of expensive function evaluations required to find accurate solutions. The key feature of the global search method in DYCORS-PEST is that the number of decision variables being perturbed is dynamically adjusted in each iteration in order to be more effective for higher-dimensional problems.
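The dynamic-adjustment idea can be illustrated with a coordinate-perturbation rule in the style of the published DYCORS framework; the constants and logarithmic decay below follow that general form but are assumptions here, not the exact DYCORS-PEST settings:

```python
import math
import random

def perturb_probability(n, n_max, dim):
    """Probability of perturbing each coordinate at iteration n. It starts
    near min(20/dim, 1) and decays logarithmically toward zero, so ever
    fewer variables are perturbed as the search matures."""
    p0 = min(20.0 / dim, 1.0)
    return p0 * (1.0 - math.log(n + 1) / math.log(n_max + 1))

def perturb(x, n, n_max, sigma=0.2, rng=random):
    """Gaussian-perturb a random subset of coordinates of candidate x,
    always perturbing at least one coordinate."""
    p = perturb_probability(n, n_max, len(x))
    chosen = [i for i in range(len(x)) if rng.random() < p]
    if not chosen:
        chosen = [rng.randrange(len(x))]
    y = list(x)
    for i in chosen:
        y[i] += rng.gauss(0.0, sigma)
    return y
```

Perturbing fewer coordinates late in the search is what keeps the method effective as the number of decision variables grows.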
Application of DYCORS-PEST to two 28-parameter groundwater calibration problems indicates that this new method outperforms PEST by a large margin for high-dimensional, computationally expensive groundwater calibration problems.

Title: Three Essays On Dynamic Stochastic General Equilibrium Models With Heterogeneous Agents And Financial Frictions
Handle: http://hdl.handle.net/1813/39479
Date: 2014-08-18
Authors: Zhao, Tianli
Abstract: The thesis consists of three essays. The first essay develops a two-country heterogeneous-agents model with equilibrium default to explore the impact of financial integration between emerging countries and the U.S. The model shows that inefficient credit monitoring in emerging countries makes borrowers in these countries more prone to default. The higher default risk makes financial intermediation in emerging countries less efficient (e.g., higher interest rate spread, higher default rate, and lower borrowing capacity). Thus, households in emerging countries rely more on their own savings to hedge against future uncertainty. As a result, these countries have a higher saving rate and lower saving return than the U.S. Given this logic, once funds are allowed to move across borders, money will move from emerging countries to the U.S. seeking higher returns. Thus, in the long run, the U.S. gradually accumulates foreign liabilities along with depressed interest rates and a relaxed credit limit. Meanwhile, wealth inequality in the U.S. gradually increases, whereas consumption inequality in the U.S. is mitigated by expanded consumer credit. The results are the opposite for emerging countries. The second essay uses the modeling framework developed in the first essay to draw important policy lessons about how an emerging country should liberalize its capital account from an initial state of financial autarky. The model shows that, due to inefficient financial intermediation, financial opening up by emerging countries may trigger a capital outflow in the short run. The sudden capital outflow raises the interest rate and crowds out domestic credit in emerging countries, and therefore a fraction of households in these countries become financially distressed, potentially leading to a liquidity crisis. The paper then shows that financial integration has different welfare impacts across households.
For example, in emerging economies, rich households benefit from financial integration but poor households suffer. Gradual change in financial openness mitigates these differences, leading to higher overall welfare. Accordingly, the paper argues for a more gradual approach to capital account opening in emerging countries. The third essay explores the linkage between financial disruptions and business cycles by studying the full equilibrium dynamics of an economy with two regimes, "normal business cycles" and "financial disruptions". The system behaves differently across the two regimes. During normal cycles, the economy fluctuates around the center of the stochastic steady state, where agents are able to maintain the optimal capital stock through collateral borrowing. During episodes of financial disruption, the productive agents are financially constrained and the economy may deviate from its efficient state, followed by a sharp decline in output and capital prices as well as a joint increase in the risk premium and the Sharpe ratio. The basic mechanism of the model is the following: since the return on capital is higher when it is owned by high-productivity agents, in equilibrium high-productivity agents accumulate capital stocks through leverage. Due to the debt enforcement problem, there is a maximum level of leverage determined by the financial market, which depends on the market's projection of the future value of collateral. The equilibrium leverage of high-productivity agents occasionally hits the endogenous maximum level, in which case financial disruptions occur. Because of the precautionary motive, there is only a low probability that the leverage constraint binds, and the economy is unconstrained most of the time. Therefore, the likelihood of financial disruption depends on the history of macro shocks and individual actions that affect equilibrium leverage.
In other words, financial disruptions are rare endogenous episodes that evolve over business cycles.

Title: Emissions Impacts Of Dynamic Pricing
Handle: http://hdl.handle.net/1813/39475
Date: 2015-01-26
Authors: Valentine, Olivia
Abstract: Dynamic pricing is a term found in a variety of industries. In the utilities industry, the implementation of a dynamic pricing structure is an economic stimulus to encourage reduction of electricity usage in peak hours, when the power system is strained and the cost of electric power is very high. This study investigated the rate structure of day-ahead hourly pricing programs in New York State and evaluated the demand and emissions impacts of dynamic pricing programs in the summer of 2008. Different scenarios of dynamic pricing programs are modeled to evaluate the change in demand and in NOx and SO2 emissions in peak hours, as well as in off-peak hours. Three methods are proposed to evaluate NOx emission reductions in New York State. Hourly emissions changes from power production in the NPCC power system model are scaled to emissions in the National Emissions Inventory (NEI), in order to simulate potential emissions changes on historical days caused by dynamic pricing. The NEI and the simulated emissions are used as point-source emissions input to the Sparse Matrix Operator Kernel Emissions (SMOKE) modeling system. The processed emissions change from SMOKE is visualized using the Visualization Environment for Rich Data Interpretation (VERDI). Results show that dynamic pricing programs can produce considerable emissions reductions in peak hours, while inducing a slight increase in off-peak hours. The emissions reduction will have non-negligible environmental and social impacts for New York State, especially for metropolitan areas like New York City.

Title: An Empirical Performance Evaluation Of Different Portfolio Allocation Strategies
Handle: http://hdl.handle.net/1813/39476
Date: 2015-01-26
Authors: Hu, Xiao
Abstract: Incorporating the Value Averaging portfolio construction method with S&P 500 firms' Aggregate Implied Cost of Capital is an investment strategy that involves undertaking risk during market recessions and recovering strongly in post-recession periods. This strategy outperforms a pure Value Averaging strategy, Dollar Cost Averaging, and Strategic Asset Allocation under different asset class weights, as measured by Internal Rate of Return, Sharpe Ratio, and Maximum Drawdown Ratio. When different risk-free borrowing caps are applied, Value Averaging incorporated with Aggregate Implied Cost of Capital results in lower risk. However, it will not yield better returns unless the maximum risk-free borrowing caps are relaxed. The strategy also requires a longer portfolio horizon to ensure a higher Internal Rate of Return.

Title: Finiteness Properties And Piecewise Projective Homeomorphisms
Handle: http://hdl.handle.net/1813/39477
Date: 2015-01-26
Authors: Lodha, Yash
Abstract: In 1929 the mathematician and physicist John von Neumann isolated an analytic property of groups from the Banach-Tarski paradox. This property is now known as amenability. He observed that groups which contain the free group of rank 2, or F2, are nonamenable, and asked whether all nonamenable groups contain F2. This was disproved by Ol'shanskii [27] and independently by Adyan [2][3] in 1979. The first finitely presented counterexample was constructed in 2003 by Olshanskii and Sapir [28]. In [26] Monod introduced a new family of counterexamples. However, Monod's examples are not finitely presentable (or even finitely generated). In this thesis we will isolate a finitely presentable nonamenable subgroup of Monod's group H(R). (This is joint work with Justin Moore [24].) The group is generated by a(t) = t + 1 together with

b(t) = t if t ≤ 0; t/(1 − t) if 0 ≤ t ≤ 1/2; 3 − 1/t if 1/2 ≤ t ≤ 1; t + 1 if 1 ≤ t,
c(t) = 2t/(1 + t) if 0 ≤ t ≤ 1; t otherwise.

An isomorphism between the group ⟨a, b⟩ and Thompson's group F was discovered by Thurston in the late 1970s by considering the reals as continued fractions. We extend Thurston's work to ⟨a, b, c⟩, and use this as a combinatorial framework to show the following: Theorem. The group ⟨a, b, c⟩ is finitely presented with 3 generators and 9 relations. We also obtain a natural infinite presentation, a normal form, and an algorithm for converting a given word to a word in normal form using the relations. Further, we show the following: Theorem. The group ⟨a, b, c⟩ acts on a connected cell complex X by cell-permuting homeomorphisms such that the following holds. 1. X is contractible. 2. The quotient X/⟨a, b, c⟩ has finitely many cells in each dimension. 3. The stabilizers of each cell are of type F∞. It follows that ⟨a, b, c⟩ is of type F∞.
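Written out as piecewise maps of the real line, the generators are easy to check numerically. The sketch below follows the piecewise definitions in the abstract; the spot-checks only verify continuity at the breakpoints:

```python
def a(t):
    """Translation by one."""
    return t + 1.0

def b(t):
    """Piecewise-projective homeomorphism of the real line."""
    if t <= 0.0:
        return t
    if t <= 0.5:
        return t / (1.0 - t)
    if t <= 1.0:
        return 3.0 - 1.0 / t
    return t + 1.0

def c(t):
    """Piecewise-projective map that is the identity outside [0, 1]."""
    if 0.0 <= t <= 1.0:
        return 2.0 * t / (1.0 + t)
    return t

# Continuity spot-checks at the breakpoints of b and c
for f, breakpoints in ((b, (0.0, 0.5, 1.0)), (c, (0.0, 1.0))):
    for t in breakpoints:
        assert abs(f(t - 1e-9) - f(t + 1e-9)) < 1e-6
```

Each piece is a fractional-linear (projective) map, which is what places the group inside Monod's piecewise projective framework.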
This provides the first example of a group that is of type F∞, nonamenable, and does not contain F2. Monod's technique for proving nonamenability of his groups relies on the nonamenability of the associated orbit equivalence relations. In forthcoming work we characterize all piecewise projective groups of homeomorphisms that are isomorphic to F and study the associated orbit equivalence relations. This thesis also contains the following unrelated construction, which gives a new illustration of a phenomenon first exhibited by Brady [6]. Theorem. Let Γ be the complete bipartite graph with 22 vertices in each set of the partition. There is a subcomplex X of the product Γ × Γ × Γ and a map χ : X → S^1 such that π1(X) is hyperbolic and the kernel H = Ker(χ* : π1(X) → Z) is finitely presented but not of type FP3. In particular, H is not hyperbolic.

Title: Optimizing Response Time For Distributed Applications In Public Clouds
Handle: http://hdl.handle.net/1813/39473
Date: 2015-01-26
Authors: Zou, Tao
Abstract: An increasing number of distributed data-driven applications are moving into public clouds. By sharing resources and operating at large scale, public clouds promise higher utilization and lower costs than private clusters. Also, flexible resource allocation and billing methods offered by public clouds enable tenants to control the response time, or time-to-solution, of their applications. To achieve high utilization, however, cloud providers inevitably place virtual machine instances non-contiguously, i.e., instances of a given application may end up on physically distant machines in the cloud. This allocation strategy leads to significant heterogeneity in average network latency between instances. Also, virtualization and the shared use of network resources between tenants result in network latency jitter. We observe that network latency heterogeneity and jitter in the cloud can greatly increase the time required for communication in these distributed data-driven applications, which leads to significantly worse response time. To improve response time under latency jitter, we propose a general parallel framework which exposes a high-level, data-centric programming model. We design a jitter-tolerant runtime that exploits this programming model to absorb latency spikes transparently by (1) carefully scheduling computation and (2) replicating data and computation. To improve response time with heterogeneous mean latency, we present ClouDiA, a general deployment advisor that selects application node deployments minimizing either (1) the largest latency between application nodes, or (2) the longest critical path among all application nodes. We also describe how to effectively control response time for interactive data analytics in public clouds. We introduce Smart, the first elastic cloud resource manager for in-memory interactive data analytics.
Smart enables control of the speed of queries by letting users specify the number of compute units per GB of data processed, and quickly reacts to speed changes by adjusting the amount of resources allocated to the user. We then describe SmartShare, an extension of Smart that can serve multiple data scientists simultaneously to obtain additional cost savings without sacrificing query performance guarantees. Taking advantage of the workload characteristics of interactive data analysis, such as think time and overlap between datasets, we are able to further improve resource utilization and reduce cost.