Wednesday, December 25, 2019
The Executive And Founder Of Growing Places
Background Information and History: Evan Breyer is the chairman and founder of Growing Places, which offered on-site childcare and preschool classes for 60 companies in the midwestern U.S. In the company's early stage, revenue was not notable. "Each quarter seemed as though it might be the last," Evan said. Therefore, they hired Rob Miranda as CEO of Growing Places. Development: Rob Miranda, a little restless, had an entrepreneurial mindset and brought some important innovations to the company. However, his entrepreneurial vision came with an abrasive personality and some disregard for social convention; he lacked emotional intelligence, and his communication style was difficult for others to accept. The board of Growing Places came up with the idea of providing scholarships for kids whose families demonstrated financial need, and a corporate sponsor seemed a good way to pay for the scholarships. The company approached Thrivand, a maker of infant formulas, cereals, and beginners' foods. Thrivand was very interested in the sponsorship idea, and its head of PR, Delores Dayton, came to Dublin to see Growing Places. During her visit, a local reporter tagged along for the tour. Delores was very interested in the company and asked Rob some questions. However, Rob's response was inappropriate: "What gets me, though, is how long some of these kids nurse. If they're old enough to ask for a Coke, it's time to move on." Growth: The response gave Rob and the
ââ¬Å"The Staffâ⬠consist of: ï ¶ Kathryn Petersen, the new Chief Executive Officer of DecisionTech, was not a normal CEO of a technology firm. Kathryn a fifty-seven-year-old who had a military and automotive industry background yet she had an extensive history of turning executive teamsRead MoreTale of Lynx Essay1380 Words à |à 6 Pagesand I had to think hard about it â⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ , that shows he never really believed in Doug but went ahead with Dough relying on Dougdââ¬â¢s experience with VC. Although Doug was considered a ââ¬Å"veteran entrepreneurâ⬠he did not contributed to Lynx as mush as a founder member is expected to do. He was not a technical expert so he couldnââ¬â¢t contribute technically to the company. He had lot of experience with VS but reading the case it doesnââ¬â¢t look like he had a lot fo contribution in getting funding for Lynx and itRead MoreThe Organization And Control System Essay852 Words à |à 4 PagesCupcakes-Palooza Organization and Control System Overview: Chris and Pat Anderson are majority shareholders and founders of Cupcakes-Palooza (CP), a privately held corporation located in Janesville, WI. CPââ¬â¢s office hours are Monday thru Friday 8:00 A.M. until 4:30 P.M. and bakery hours are 4:00 A.M. until 12:30 P.M. During bakery operations, CP produces and sells roughly 15,000 cupcakes weekly to selective grocery stores in the Janesville area. Despite a weak economy, sales have been steadyRead MoreS. A Building Case Study891 Words à |à 4 PagesStudentââ¬â¢s name University Table of Contents I. Executive Summaryâ⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦.3 Mission Management and Staff Marketing and Customer Base II. General Descriptionâ⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦.â⬠¦3 III. Background informationâ⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦..â⬠¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦Ã¢â¬ ¦4 Scope of Operations Emerging and Future Trends Finances and Pricing Our Place within the Industry Sales Strategy Overcoming Barriers to Entry Executive Summary Mission S.A Buildings Company mission is to construct qualityRead MoreEssay about Refresh Organics - Harvard Case719 Words à |à 3 Pagesï » ¿Summary ââ¬â This case looks a decision that George Hausman, the co-founder and CEO of Refresh Organics (RO), makes regarding creating a board of directors. RO is a midsize, steadily growing, privately owned company which is a distributor of organic produce. RO has never had a formal board of directors, but Hausman had several close business advisors who he consulted with regularly and referred to as ââ¬Å"the kitchen cabinet.â⬠Hausman considered putting together a true board of directors or if simply makingRead MoreThe, Young Entrepreneurs, Robert Kalin, By Chris Maguire1553 Words à |à 7 Pagesor B2B (business to business). Two years later in 2007, Etsy had nearly 450,000 registered sellers generating $26 million in annual sales and over one million sales. That same year, the company took in over $3 million in venture funding. ETSYà ¢â¬â¢s growing popularity as an online retailer of anything from abstract art to commonplace household curiosities was starting to get out of control. Deeply in need of leadership, Kalin hired Chad Dickerson, senior director of product at Yahoo. 
Dickerson was broughtRead MoreObama Administration : Presidential Power1585 Words à |à 7 Pagesembrace an almost unlimited view of Presidential responsibility and power . Thus, in the wake of a catastrophe, Congress was more than willing to grant emergency power so that the President could better protect the nation from harm. This expansion of executive authority represents the rule rather than the exception in American Politics. As a nation, we expect our president to do nothing less than solve all national problems and unite the country. Anything less is a failure. To match that responsibility
Tuesday, December 17, 2019
An Analytical Study Of Popular Biometric Tools And Impacting Factors
An Analytical Study of Popular Biometric Tools and Impacting Factors. Biometric tools have adapted and been refined alongside related research. This paper's aim is to discover any key biometric tools within a specific period. Content analysis was applied to scholarly research from the field of biometrics in an attempt to discover patterns within scholarly publications. Specifically, are there any trends in the different types of biometric tools? Another key research question is: are there any factors impacting biometric tools? Introduction: Biometric identification has been used by humans for thousands of years. This recognition relies on certain body characteristics such as voice, face and movement. Body identification was first systematically implemented by Alphonse Bertillon in the Paris police department in the mid-19th century to identify criminals by their body measurements (Jain, Ross and Prabhakar 2004). The later discovery of the individuality of fingerprints was significant, and it progressed into police officers "booking" criminals' fingerprints for identification. From the early era of biometric techniques to modern times, owing primarily to the rapid development of technology, biometric research has come to focus on other biometric attributes. Methodology: An analytical tool, primarily content analysis, will be used to analyze a data set of scholarly literature from the Scopus database. Content analysis can provide a more summative and broader snapshot of such
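To make the methodology concrete, the kind of content analysis described above can be sketched in a few lines of Python. This is a minimal illustration only: the file name scopus_export.csv and the column names Year and Abstract are assumptions about a hypothetical Scopus CSV export, not the paper's actual data, and the list of modality terms is an invented example.

    # A minimal sketch of term-frequency content analysis over a
    # hypothetical Scopus export: counts mentions of biometric
    # modalities per publication year to surface trends.
    import csv
    from collections import Counter, defaultdict

    MODALITIES = ["fingerprint", "face", "iris", "voice", "gait"]  # example terms

    counts_by_year = defaultdict(Counter)
    with open("scopus_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            year = row["Year"]              # assumed column name
            text = row["Abstract"].lower()  # assumed column name
            for term in MODALITIES:
                if term in text:
                    counts_by_year[year][term] += 1

    for year in sorted(counts_by_year):
        print(year, dict(counts_by_year[year]))

A real study would add stemming, deduplication and coder validation, but even this simple count is enough to show whether attention is shifting between modalities over time.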
Sunday, December 8, 2019
Turning Great Strategy Into Great Performance
Every year the top management at many companies spends months developing strategies. Years later, the performance of the company is nowhere near what the plan had projected. Often leaders think that the execution failed, but in most cases they need a better strategy to stop their underperformance. To close this so-called "strategy-to-performance gap," disciplined planning and execution processes are needed. In the fall of 2004, Marakon Associates surveyed companies translating their strategy into performance to analyze the most common causes of the gap and the actions taken to close it. In fewer than 15% of the analysed companies did business results reach the performance plans, which creates the risk of embedding the same disconnect between results and forecast in future decisions. Companies make multiyear performance projections, which creates the "venetian blind" phenomenon, involving three problems: first, financial forecasts are unreliable; second, portfolio management gets derailed; and third, communication with the investment community suffers. Because of poor forecast quality, the average strategy delivers only 63% of its potential financial performance, with performance lost to inadequate resources (7.5%), poorly communicated strategy (5.2%), and the absence of clearly defined actions to execute (4.5%). Because of the difficult process of developing plans, allocating resources and tracking performance, top management cannot discern whether the gap is a result of poor planning, poor execution, or both. They do not know whether critical actions were carried out as expected, resources were deployed on schedule, or competitors responded as anticipated, so it is impossible to take appropriate corrective action. The problem for a company creating unrealistic plans that will not be fulfilled is a culture of underperformance, because it becomes the norm that performance commitments will not be kept. Consequently, closing this strategy-to-performance gap is the only way to realize more of a strategy's potential, following these seven rules of planning and execution: Rule 1: Keep it simple and make it concrete. Use clear language describing the course of action so that everyone is clear about what the strategy is and is not, and headed in the same direction. Rule 2: Debate assumptions, not forecasts. A fact-based discussion ensures that units cannot hide behind details and corporate-center executives cannot push for unrealistic goals. Rule 3: Use a rigorous framework, speak a common language. Each unit assesses what share of the profit pool it can realistically capture, given its business model and positioning. The framework establishes a common language that all teams understand and use.
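The quoted figures can be read as a simple performance waterfall, illustrated by the short Python sketch below. Note that only three loss categories are itemized in this summary, so the "other losses" entry is a plug value inferred from the reported 63% average realization, not a number from the survey.

    # Toy "strategy-to-performance" waterfall using the figures quoted above.
    # "other losses" is a derived plug, not survey data.
    itemized = {
        "inadequate resources": 7.5,
        "poorly communicated strategy": 5.2,
        "no clearly defined actions": 4.5,
    }
    realized = 63.0  # reported average share of potential actually delivered
    other = 100.0 - realized - sum(itemized.values())

    for cause, points in itemized.items():
        print(f"{cause}: -{points:.1f} points")
    print(f"other losses (inferred): -{other:.1f} points")
    print(f"performance realized: {realized:.1f}% of potential")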
Sunday, December 1, 2019
The History of Cars
Introduction It is hard to imagine life without cars. Cars have become an important part of our lives, and the quality and power of the car one drives is used to define one's standard of living. They have become a status symbol rather than serving their original purpose as a means of transport. Cars provide an individualized and privatized means of transportation. The motor vehicle did not arise in a single day but rather evolved from the earliest models to the more sophisticated automobiles dominating our roads today. It has been a gradual process, starting from when the wheel was first invented and passing through several other stages. This paper traces the stages through history that cars have undergone, from the primitive carriages of the 1880s to the fast, complex, and comfortable vehicles that dominate our roads in the 21st century (Volti 1). The Invention of the Wheel This marked the infancy stage in the development of transportation as we know it today. It is hard to know who invented the wheel or exactly when, but reports show that it may date back more than 5,000 years. The inspiration for the development of the wheel arose as people sought easier and better means of moving things around. People had discovered that rounded objects could lessen the amount of effort needed if heavy things were placed over them and pushed along. The sledge was later incorporated as a means of moving things, and it was recognized that when a sledge was pulled over a smooth surface or over logs, the amount of energy needed to push it was reduced. The sledge was further improved by making grooves in the logs on which it was placed, reducing the friction between the logs and the sledge. This further lessened the effort needed, and when the wood between the two inner grooves was cut away, the wood that remained between the grooves formed the axle. This type of sledge formed the first carts. The next natural thing the inventors of the wheel did was to design the axle so that it could fit into a hole made in the centre of the wooden wheel. Further improvements were made to ensure that the axle remained static while the wheel rotated on it. Further refinements to the design of the wheel were made in different parts of the world to fit different purposes such as war chariots, racing carts, and freight wagons. The Age of the Horse-Drawn Carriages During the first stages, wheeled vehicles were pulled by people, oxen or horses, but later the internal combustion engine was invented to replace "horse power," as it was then called. The use of horses to pull carriages allowed people to wield more power and to expand their territorial borders. The amount of pollution caused by horse waste in European cities led inventors to look for alternative forms of transportation. The Horseless Carriages and the Steam Engine Steam-powered vehicles came into being in the late 18th century but were only considered potentially practical in the early 19th century. Nicholas Cugnot (1725-1804) built the first steam-powered vehicles, which were intended to haul French army artillery.
It is recorded that his first steam vehicle travelled at a steady 3 km/hr but ran out of steam in less than 25 minutes (Volti 2). Cugnot's second steam engine also failed, prompting the government to drop the project. Development of Locomotives The early 1830s witnessed continued interest in steam as a source of power, and the period saw continued development in locomotives, steam-powered tractors, and other forms of vehicles. In England, automobiles powered by steam were on the rise, but their growth was terminated prematurely as more emphasis was laid on rail locomotives. The Role of the Bicycle in the Development of Cars Great strides were made in the development of bicycles during the 1840s. The development of the bicycle is very important in the history of cars, as most of the parts found in early cars owed their origin to it. Such parts as the chain-and-sprocket drive, the tires, bearings, spoked wheels and many other components of the automobile were derived from the bicycle. Steam cars faced limitations in building technology and a lack of good roads, and it was only in the 19th century that personal transportation emerged. These cars were very heavy, which meant that they could only travel on rails to function effectively. Steam engines were therefore used on railroads, leading to the great success of the railroad industry. The steam engines operated by burning wood or other fuel, and the heat thus generated boiled water in boilers. The resulting steam drove pistons up and down, in the process turning the crankshaft, which ultimately moved the wheels. These steam engines required numerous stops to replenish their water and also needed a long time to start. The Internal Combustion Engines Early attempts The invention of the internal combustion engine was influenced by the idea of personal mobility, a venture that required individual vehicles to have their own source of power. As already seen, steam power was out of the question and could not be used in personal cars because of its weight and the fact that steam vehicles could only run effectively on rails. This led to the idea of an internal combustion engine that contained an air-fuel mixture within it. Attempts to make an internal combustion engine date back to the late 17th century, when efforts to use gunpowder were made but failed. Lenoir's double-acting engine In the 1850s, a French engineer, Etienne Lenoir, constructed a double-acting engine in which an ignited mixture of air and gas pushed the piston to the far end of the cylinder, creating a power stroke. When the piston was pushed back by the same mechanism, another power stroke was created and the exhausted gas was expelled. Therefore, for each revolution of the crankshaft, two power strokes were produced. This kind of engine had the disadvantage that the air was not compressed before being burned, so the engine produced less power and was inefficient. However, Lenoir's engine was far better than the steam engine since it had a higher thermal efficiency. Early development in Germany In Germany, Nicolaus Otto also made significant steps in the development of the engine. Together with his friend Eugen Langen (1833-1895), Otto developed a four-stroke cycle in 1876. This model of engine was not efficient, but it was a step toward building more improved combustion engines.
Daimler's first "car" In 1885, two of Otto's workers, Gottlieb Daimler (1834-1900) and Wilhelm Maybach (1847-1929), improved on Otto's four-stroke engine by installing a single-cylinder engine on a two-wheeled frame, which came to be referred to as the first motorcycle. The same engine was installed on a four-wheeled wagon to make the first internal combustion engine car (Volti 4). Carl Benz (1844-1902) constructed a three-wheeled vehicle that used Otto's four-stroke combustion engine and was better than the one constructed by Daimler (Flink 11). Carl's three-wheeled vehicle marked the beginning of personalized road trips when his family made a 200-kilometre journey in it. The progress of the internal combustion engine in France Germany is credited with the manufacture of the first cars, but credit also goes to France, where considerable steps were made in the motor industry. Peugeot, a maker of steel goods, constructed its first car using a V-twin engine of Daimler design. Peugeot later produced its own engine design, which used independently pivoting wheels. Another French company, Panhard et Levassor, also introduced a car that was more sophisticated than the Peugeot. Panhard's layout, termed the systeme Panhard, had the engine mounted at the front, turning the rear wheels via a driveshaft that ran underneath the car. The United States of America joins the car industry The United States made a slow start in motor vehicle invention and remained stuck in the "buggy" despite making big strides in the manufacture of other industrial products such as watches, typewriters and firearms at relatively low cost. Its first internal combustion engine automobile was designed by Charles and Frank Duryea in Massachusetts in 1893. The car was propelled by a single-cylinder engine and contained a spray carburetor and electric ignition. In 1894, the first gasoline car was made by Elmer and Edgar Apperson using the Haynes design. The Duryea Motor Wagon Company came into existence in 1895 and specialized in gasoline cars. Henry Ford first built his two-cylinder engine car in Detroit and registered the Ford Motor Company only in 1903. Other important car builders in the US during the early stages included Ransom E. Olds and William C. Durant, who founded General Motors in 1908. Electric Cars These mostly came into being in the 1830s but failed because early batteries were limited in their capacity to store energy. Because they operated within towns, these cars were relatively advantageous, since they did not need to be periodically replenished with water (Larmine and Lowry 93). They could also travel comparatively longer distances. However, their prominence dwindled in the 1900s as the advantage shifted to gasoline cars. Other factors that led to the near demise of electric cars were the expansion and improvement of roads between cities, which created the need for long-range cars. Gasoline availability also meant that gasoline cars were easier to maintain than electric cars. Another blow to electric cars was the invention of the electric starter by Charles Kettering in 1911, a great success in the evolution of automobiles; prior to this development, vehicles powered by gasoline were started by a hand crank, which was more dangerous and difficult to use.
Finally, the initiation of mass production of vehicles using the internal combustion engine made them more available and affordable than electric cars (Mom 98). During the years that followed (1911 to the 1960s), electric cars almost completely disappeared. The years between 1960 and 1970 saw an urge to reintroduce electric cars, mainly because of the increase in air pollution caused by internal combustion engine cars and the rise in the price of crude oil. There followed many attempts by various companies to come up with electric trucks that would be easy to maintain. Recent developments have been aimed at producing environmentally friendly vehicles, and emphasis has been laid on electric cars. Some of the modern electric cars on the market include the Toyota RAV4 sport, the Honda EV Plus sedan and several models from Chrysler. Manufacturing Methods Early car makers employed similar techniques in the manufacture of cars, techniques much like those used in heavy engineering industries. Early automobile companies were often initially bicycle makers, such as Peugeot and Riley in Britain. These motor companies used skilled workers in modest workshops, but as the volume of production increased, there was a change to a batch system. In the one-off system, workers and parts moved to the area of the workshop where the car was positioned. Accessory machines were also grouped according to the type of work each performed. Conveyor belts were first used in Henry Ford's workshop, where assembly of the car parts was later done in one location. The first moving assembly line was used in Ford's workshop, and the chain-driven assembly line replaced the sliding-rail system there. Modern manufacturing techniques are highly automated, and in some companies most of the work is done by robots. The final results of modern technology are faster, more comfortable and more reliable cars. Modern Internal Combustion Engine Cars There is a great deal of difference between the cars we have today and those of the early years. However, it is worth mentioning that the principles have remained relatively the same; only the outward appearance and a few other aspects have changed. Internal combustion engines may use diesel or petrol. For a long time, diesel-powered vehicles were neglected, but they are now making a comeback because of their high efficiency and long life. Diesel-powered vehicles can also burn other types of fuel, though they are more expensive than gasoline cars. There are numerous car manufacturing companies today compared to the earlier years. The US is the highest producer of motor vehicles in the world today, while Japan is second. The number of cars in the world today exceeds 1 billion, and that number is expected to rise in the near future. This large number of cars continues to have far-reaching effects on the environment due to pollution, thus raising the issue of environmentally friendly cars (Walsh 4). Conclusion It is true that the invention of cars completely changed the way of life of man. It was a gradual process that took place over a long period and went through numerous stages. The history of the car spans back about 250 years and took place in several countries in Europe and in the US. Germany is credited with being the birthplace of the motor car.
This important technical invention has helped shape various cultures around the world.

Glossary
Automobile: a wheeled motor vehicle with its own engine, used in transport
Axle: a small shaft around which a wheel rotates
Battery: electrochemical cells designed to convert chemical energy into electrical energy
Buggy: a light carriage pulled by one horse or ox
Carburetor: a component of an internal combustion engine that mixes air and fuel
Carriage: a horse-drawn vehicle
Chariot: a carriage drawn by horses and mainly used in ceremonies
Combustion: the process of reacting a substance with oxygen to yield heat and light
Conveyor belt: a mobile belt used in industries to transport objects
Crankshaft: a shaft found in cars that rotates when driven by a crank
Crude oil: dark oil containing many hydrocarbons
Demise: death
Diesel: heavy oil
Driveshaft: a metal shaft that transmits rotary power from the point of production (the engine) to the point of application
Dwindle: to decline or decrease
Gasoline: a very volatile mixture of hydrocarbons derived from petroleum that functions as fuel for vehicles
Groove: a furrow cut in wood
Haynes design: an early design in which the engine was placed at the front and the power transmitted to the rear of the car
Ignition: the process of making something catch fire
Petrol: gasoline
Piston: a part of an internal combustion engine, housed in a cylinder, that channels power from the expanding gas to the crankshaft
Pollution: contaminating the environment with harmful unwanted substances
Propel: to use force to make an object move forward
Railroad: a metal road on which trains travel
Replenish: to refill
Robot: a device designed to move automatically
Sledge: a small vehicle pulled by a dog or a horse
Spoked wheel: a wheel with wire or wooden spokes that held the axle in place
Steam: vapour produced when water is heated

Works Cited
Flink, James. The Automobile Age. USA: MIT Press, 1998.
Larmine and Lowry. Electric Vehicle Technology Explained. USA: John Wiley and Sons, 2003.
Mom, Gijs. The Electric Vehicle: Technology and Expectations in the Automobile Age. USA: Johns Hopkins University Press, 2004.
Volti, Rudi. Cars and Culture: The Life Story of a Technology. New York: Johns Hopkins University Press, 2004.
Walsh, Michael. Moving Toward Clean Vehicles and Fuels: A Global Overview. New York: Air and Waste Management Association, 2004.
Tuesday, November 26, 2019
Cross-Species Virus Transmission Essays
According to the article, supported by the American Society for Microbiology, viruses have once again been making a splash by giving rise to new diseases through their ability to switch hosts. SARS, Ebola fever and influenza are a few examples in which viruses have undergone transmission from wildlife hosts to human hosts. Furthermore, it was reported that when the HIV/AIDS virus crossed the species barrier from primate to human about 70 years ago, a major threat arose, for a great number of people were infected and continue to be infected today. As outlined by the authors, there are three specific stages in which viral diseases emerge and successfully switch from donor host to recipient host. Therefore, this article focuses upon and examines the variables that affect the success rate of emerging viral diseases by the way they influence the three stages previously noted. These variables, which affect the mechanism of host switching, consist of environmental and demographic barriers, host barriers, existing host ranges, and viral evolution in terms of transmission, recombination, reassortment and viral intermediates. In order for a host switch to be successful, there has to be interaction between the virus and the possible new hosts. However, if contact between the two is either prevented or limited, then the likelihood of transfer is weakened. This barrier is discernible in the HIV virus, which prior to its global emergence was transferred to humans several times with little success because of the limited interaction between primates and humans. However, once primates were able to come into contact with a large enough human population, transmission became successful, and the effects can be seen today, for infections still arise. Figure 2 in the article shows the transfer of viruses into human host populations with little or no onward transmission, along with the occasional viruses that are able to emerge and cause epidemics. Based upon the findings, various demographic factors and human behaviours such as travel, intravenous drug use, sexual patterns and contacts, farming practices, and agricultural expansion increase viral host switching and promote the emergence of new diseases. As confirmed by the authors, human trade and travel patterns have spread insect vectors of viruses and viral pathogens such as SARS, while migratory birds carrying the influenza A virus have been able to reach a broad range of populations. In addition, ecological changes brought about by human actions have influenced the emergence of the Nipah virus in Malaysia. For example, bats are considered to be the reservoirs of the Nipah virus, so when people decided to plant fruit orchards around pig farms, the bats became attracted to the orchards and caused a spillover which infected the pigs. In turn, people working with the infected pigs became exposed to the virus, and this caused an increase in animal-to-human virus transmission. Host barriers are the second variable examined in the article and shown to affect the mechanism of host switching. In order for transmission to be a possibility, a virus has to be able to infect the cells of a new host. Yet this process can be blocked at various levels, such as receptor binding, entry into the cell, genome replication or gene expression.
Because of these multiple host barriers, the virus would have to undergo changes to overcome all of them, which increases the difficulty of transmission. The article also states that innate antiviral responses from host cells and apolipoprotein B-editing catalytic polypeptide (APOBEC) proteins further reduce the risk of infection by blocking infection of subsequent cells. In examining evolutionary relatedness, species that are closely related to one another have an increased likelihood of viral host switching, as seen between chimpanzees (Pan troglodytes) and humans, resulting in the establishment of HIV. On the other hand, due to relatedness, certain limitations arise based on cross-immunity to related pathogens and innate immune resistance to related viral groups. Another aspect of host barriers is the physical entry of the virus into the cell. Upon entry, there are host glycans or lectins which bind to the virus particles to prevent infection. Also, a deficiency of neuraminidase proteins, used in the process of emergence from the cell, causes viral inactivation, which further aids in the prevention of transmission and emergence. Because viruses are specific to their appropriate host, they are also specific to the various receptors by which they bind to the host cell. For example, the HIV virus binds specifically to CD4 host receptors, whereas avian viruses recognize sialic acids found on host cells. Aside from receptor binding, there are also intracellular restrictions which decrease viral transmission. For example, interferon responses are found to be host specific and thus more likely to protect cells against viruses. This can be observed with alpha and beta interferons, which restrict the murine norovirus from entering the host cell. The authors further examined the host ranges of viruses and whether they were a factor in host switching. It was conjectured that preexisting host ranges influence the ability of a virus to become established in a new host. Viruses were classified as either generalist, infecting many different hosts, or specialist, infecting only a few related hosts. The expectation was that generalist viruses would show a greater likelihood of switching to additional hosts, whereas specialist viruses would be more subject to restrictions on host switching. However, looking at the data in Table 1, it became apparent that both generalist and specialist viruses have transmitted successfully into new hosts, thus weakening the generalization previously made. The last variable under review for affecting the mechanism of host switching is the set of viral evolutionary mechanisms, which consists of viral fitness tradeoffs, modes of virus transmission, recombination and reassortment, and viral intermediates. It has been speculated in the article that cross-species transmission is more common in rapidly evolving viruses. This means the greater the rate of variation, the more likely a virus is able to adapt to a new host and undergo transmission. Because RNA viruses lack proofreading mechanisms and comprise large viral populations, they are more likely to evolve and transmit within a new host. Yet there is evidence that some RNA viruses have developed host specialization and that the rates of variation of DNA viruses should not be underestimated in comparison to RNA viruses.
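The compounding effect of these sequential barriers can be illustrated with a toy calculation. The per-barrier probabilities in the sketch below are invented for illustration; the article does not quantify them. The point is only that when a virus must independently clear several barriers, the overall chance of a productive infection is the product of the per-barrier chances and falls off quickly.

    # Toy model: sequential host barriers compound multiplicatively.
    # All probabilities here are invented for illustration only.
    barriers = {
        "receptor binding": 0.30,
        "cell entry": 0.50,
        "genome replication": 0.40,
        "gene expression": 0.60,
    }

    p = 1.0
    for name, p_pass in barriers.items():
        p *= p_pass
        print(f"after {name}: {p:.3f}")

    print(f"overall chance of productive infection: {p:.1%}")  # 3.6%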
Because viruses are able to undergo various mutations, their ability to infect new hosts increases, but in the long run this reduces their fitness in the donor host. As seen in Figures 2 and 3 of the article, this is termed a fitness tradeoff. However, not all mutations cause a decrease in fitness, for there are a few advantageous ones that increase it. Moreover, when only a few adaptive mutations are required between donor host and recipient host, transmission becomes more efficient. As far as emergence and successful host transfer are concerned, the modes of virus transmission impose restrictions. For example, if viruses are not able to survive between donor, recipient and/or vector hosts, then emergence becomes a challenge. Furthermore, different pathways of transmission, whether by droplet spread, sexual inoculation or the fecal-oral route, present challenges in accommodating various hosts. Recombination and reassortment make viruses more open to genetic alterations that are beneficial in the long run. Comparing RNA and DNA viruses, the possibility of recombination varies, but in the retrovirus HIV there is a high rate of recombination, which may correlate with its effective emergence. Figure 5 examines possible roles of recombination in the HIV virus as well as its origin from other primates. Another example of a recombinant virus can be seen in SARS CoV, which most likely arose from a combination of a CoV virus and another bat virus before infecting human hosts. Once a virus has been able to switch to a new host, it further uses recombination and reassortment in the process of adaptation. Lastly, there have been instances in which viral intermediates with lower fitness were required in the process of successful transmission. Aside from adapting to their new hosts, viruses also have to optimize their strength in the host cells while at the same time evading immune responses. By being able to detect viruses that do not spread efficiently, there would be a greater chance of controlling epidemic outbreaks.
Last, the writers examine mutants and versions of viruses in response to their ability to emerge and do diseases. These last few paragraphs see what occurs one time a virus has entered into a host cell and its ability to keep entry by undergoing advantageous mutants or farther accommodating to the receiver hosts. The article ends with a basic sum-up or overview of all the information presented earlier with the mentality of being able to command future epidemic diseases. Upon reading this diary article, the manner of authorship is instead complex in certain subdivisions which would suppr ess the audience from to the full understanding the stuff. Besides, there look to be no major experiments or trials conducted by the writers for the diary article merely contains basic information and referenced informations. Further, in rather a few subdivisions, the writers use phrases such as, poorly understood and we know comparatively little in discoursing the information. This would most likely make the audience disbelieving as to the writers competency and apprehension of the information being presented. In add-on, while showing their information, the writers chiefly focus on few specific viruses such as HIV, SARS and Influenza. What about the remainder? Are other viruses less prone to host shift or traversing barriers? These facets need to be addressed so that the audience is non able to chew over and do generalisations. Last, no solution is stated as to the bar of future epidemics from the possibility of viral host shift. The lone statements made in the diary consisted of being able to better understand the information presented and the complexnesss that follow this subject. Overall, this article should merely be used as background information in helping old cognition on the subject. The writers accomplish the undertaking of supplying the audience with information, but the manner and the presentation should hold been executed otherwise.
Friday, November 22, 2019
The Philosophy of Avenue Q Lyrics - An Analysis
The Philosophy of Avenue Q Lyrics - An Analysis Avenue Q Lyrics - The Philosophy of Avenue Q Lyrics During a recent visit to London, I wandered through Covent Garden on my way to watch a West End production of Avenue Q. While passing various shops and street performers I spotted a large plaque placed on the walls outside of St. Pauls church. It was here, said the sign, that the famous Punch and Judy Shows were performed during the 1600s. Thats right, Shakespeares plays had to compete with puppet shows. In traditional Punch and Judy shows, the anti-hero Punch insults, pesters, and beats his fellow characters, much to the delight of the audience. The Punch and Judy shows were a glorious display of political incorrectness. Today, the tradition of puppets delivering obnoxiousness and social commentary continues with Avenue Q. The Origin of Avenue Q Avenue Qs music and lyrics were created by Robert Lopez and Jeff Marx. The two young composers met in the late 90s while involved in the BMI Lehman Engel Musical Theater Workshop. Together they have written songs for Nickelodeon and The Disney Channel. However, they wanted to create a puppet-friendly show that was strictly for adults. With the help of playwright Jeff Whitty and director Jason Moore, Avenue Q was born - and has been a hit Broadway show since 2003. Sesame Street for Grown Ups Avenue Q could not exist without Sesame Street, the long running childrens show that teaches kids letters, numbers, and practical life-lessons. The premise of Avenue Q is that adolescents grow up without learning the truth of adult life. Like the puppet protagonist Princeton, many new grown-ups experience anxiety and confusion when entering the Real World. Here are some of the lessons offered by Avenue Q: School / College Does Not Prepare You for Real Life With songs like What Do You Do with a B. A. in English? and I Wish I Could Go Back to College, Avenue Q lyrics portray higher education as an extended stay in the carefree Land of Adolescence. Princetons main conflict is that he is drifting through life, trying to discover his true purpose. One would hope that college would establish this sense of purpose (or at least a sense of self-sufficiency), but the puppet croons to the contrary: I cant pay the bills yet / Cause I have no skills yet. / The world is a big scary place. The ensemble of characters, both human and monster, wistfully recall the days when they lived in a dormitory with a meal plan, a time when if things got too difficult they could just drop a class or seek an academic advisors guidance. This criticism of the education system is nothing new. Philosopher John Dewey believed that public education should proactively prepare students with useful critical thinking skills rather than just facts from books. Modern day critics such as John Taylor Gatto further explore the failures of compulsory learning; his book Dumbing Us Down: The Hidden Curriculum of Compulsory Schooling explains why many people feel the same social / intellectual impotence expressed within Avenue Qs lyrics. The Freedom to Find Our Own Purpose Princeton decides that he should seek his purpose in life. At first his quest for meaning is guided by superstition. He finds a penny from the year he was born and considers it a supernatural sign. However, after a couple a false-start relationships and a dead-end job or two, he realizes that discovering ones purpose and identity is a difficult, never-ending process (but an invigorating process if one chooses to make it so). 
Steering away from lucky pennies and mystical signs, he becomes more self-reliant by the musical's conclusion. Princeton's resolution to find his own path would be smiled upon by existential philosophers. The main component of existentialism is the assumption that humans are free to determine their own sense of personal fulfillment; they are not bound by gods, destiny, or biology. When Princeton laments, "I don't even know why I'm alive," his girlfriend Kate Monster replies, "Who does, really?" A rather existential response. There Are No Selfless Deeds Perhaps there are good deeds, according to Avenue Q, but there seem to be no purely selfless deeds. When Princeton decides to generate money for Kate's School for Monsters, he does so because it feels good to help others... and he also hopes to win her back, thereby rewarding himself. The lyrics from Avenue Q's "Money Song" explain, "Every time you do good deeds / You're also serving your own needs. / When you help others / You can't help helping yourself." This bit of wisdom would please Ayn Rand, author of controversial classics such as Atlas Shrugged and The Fountainhead, whose concept of objectivism specifies that one's purpose should be the pursuit of happiness and self-interest. Therefore, Princeton and the other characters are morally justified in performing good deeds, so long as they do so for their own benefit. Schadenfreude: Happiness at the Misfortune of Others If you've ever felt better about your life after watching the miserable guests on a Jerry Springer re-run, then you've probably experienced schadenfreude. One of the Avenue Q characters is Gary Coleman, a real-life child star whose millions were squandered by his irresponsible family. In the show, Coleman explains that his personal tragedies make others feel good. Ironically, it becomes a virtue (or at least a public service) to be a wretched failure or a victim of calamity. (This, by the way, would be frowned upon by Ayn Rand.) Characters such as Coleman and the recently homeless puppet Nicky improve the self-esteem of the mediocre masses. Basically, these lyrics make you feel better about being a loser! Tolerance and Racism in Avenue Q Heterosexual puppet Nicky tries to help the sexually repressed puppet Rod come out of the closet. He sings, "If you were queer / I'd still be here / Year after year / Because You're Dear To Me." A bit more devious (in a good way) is the song "Everyone's A Little Bit Racist." During this number, the characters proclaim that "everyone makes judgments based on race," and that if we accepted this "sad but true" premise, society could "live in harmony." The song's argument might be specious, but the audience's self-deprecating laughter throughout the musical number is very telling. Everything in Life Is Only For Now Recently, "spiritual" books such as Eckhart Tolle's have been asking readers to focus on the present, to embrace "The Power of Now." (I wonder... does this message anger historians?) In any case, this currently popular concept stems from ancient times. Buddhists have long since explained the impermanence of existence.
Avenue Q follows the Buddhist path in its final song, "For Now." These cheerful Avenue Q lyrics remind the audience that all things must pass: "Each time you smile / It'll only last a while." "Life may be scary / But it's only temporary." In the end, despite its zaniness and crude jokes, Avenue Q delivers a sincere philosophy: we must appreciate the joys and endure the sadness we currently experience, and acknowledge that all is fleeting, a lesson that makes life seem all the more precious. Why Puppets? Why use puppets to deliver the message? Robert Lopez explained in a New York Times interview, "There's something about our generation that resists actors bursting into song on the stage. But when puppets do it, we believe it." Whether it's Punch and Judy, Kermit the Frog, or the cast of Avenue Q, puppets make us laugh. And while we are laughing, we usually wind up learning at the same time. If a regular human were on stage singing a preachy song, many folks would probably ignore the message. But when a muppet talks, people listen. The creators of Mystery Science Theater 3000 once explained that "You can say things as a puppet that you can't get away with as a human." That was true for MST3K. It was true for the Muppets. It was true for the bombastically cruel Punch, and it is eloquently true for the ever-insightful show Avenue Q.
Thursday, November 21, 2019
Martial Arts In Renaissance Europe
The gathered evidence takes the form of either scriptures or paintings. It is important to mention gladiatorial combat, an event that occurred in Rome as early as 260 BC. The evidence depicts wrestling techniques practiced in those early ages, along with specific tools designed and used for self-defence and offence. According to the historical record, groups practiced specialized martial arts during crucial combat. Pictorial displays of historic combat appear in compilations such as the Bayeux Tapestry and the Morgan Bible, which list the details of the techniques necessary for combat alongside the design of the tools. In European history, a specific manual has been discovered which depicts the teaching of martial arts: the earliest "extant dedicated martial arts manual is the MS I.33 (ca. 1300), detailing sword and buckler combat." During the High and Late Middle Ages, common martial arts included jousting and systems of fencing. It was also found that during the Late Middle Ages, books on martial arts and fighting known as Fechtbücher were compiled, regarded as "instructional treatises" (Mangan, 2001). The affiliation between Europe and the martial arts has a long and significant history. "Martial" literally means "of Mars," Mars being the Roman god of battle. It is therefore justifiable to link the history of martial arts with Ancient Greece. The literature on martial arts gathered and compiled by European historians originated in the traditions of Mediaeval and Renaissance Europe; the compilations take the form of treatises detailing combat techniques. Europe has a history of political and local struggle against injustice and mutiny, and martial arts are therefore mentioned on several accounts, in practice intended to communicate the fundamentals of defence to the armed forces and the public (Jane, 1995). Contemporary martial arts originate from the traditions of Mediaeval and Renaissance Europe. Several schools with distinctive practices have been identified: some are categorized as Italian, Spanish, German or English in style, while others focused on weapons combat, mainly with the sword. Important institutions linked to the promulgation and promotion of martial arts within Europe include the Academy of European Swordsmanship, a school that has researched traditional swordsmanship. The techniques covered in this practice of martial arts include strikes, locks and breaks, throws, wrestling, and disarms. These martial arts are limited to weapon and self-defence combat; however, the martial arts practiced during the Renaissance focused largely on hand-to-hand combat. The practices which originated during the Renaissance evolved into boxing and fencing. The core values of the martial arts of Renaissance Europe focused primarily upon defensive and combat techniques, especially "learning to defend against knives, empty hand, ground fighting, pole weapons and swords" (Mangan, 2001). Discussion The existence of
Tuesday, November 19, 2019
Gay marriage in China
In China, only very small achievements surrounding the issue of gay marriage have been realized. According to Fedorak, "in 1997, the law that outlawed sodomy was repealed, and in 2001, homosexuality was no longer classified as a mental illness" (90). There have been numerous attempts to have the marriage laws that recognize only straight marriages amended, but they have all been unsuccessful. This paper is an argumentative research essay that proposes that gay marriage should not be legalized in China. This position follows a thorough look at, discussion of, and reflection on both sides of the argument. The existence of homosexuals in China is a reality: same-sex relationships have existed in China for a very long time. According to Newton, "long-term same-sex loving relationships were common during certain periods of early Chinese history, with at least 10 emperors between the period 206 BCE and 1 CE known to have been involved in such relationships" (5). This, however, does not mean that homosexuality was accepted; rather, it was fairly tolerated. Drescher and Lingiardi point out that "it was only after 1949 that homosexual behavior was seriously punished in China and served as grounds for persecution during Chinese political upheavals between the 1950s and 1970s" (117). From this, it is clear that the Chinese people have always accepted the existence of homosexuality, but not its legalization. All Chinese citizens have equal rights: the only reason the law should deny people their rights is in instances where those rights are against the law. Chinese criminal law has no specific statement that describes the status of homosexuals or whether homosexuality is illegal. Only the following statement exists in the nation's laws: "all hooliganism should be subjected to arrest and sentence" (West and Green 63). In this case, hooliganism means any disruption of social order. Since homosexuality is greatly condemned in Chinese society and viewed as
Sunday, November 17, 2019
Dominating the poem
Ode to a Nightingale is an antithesis of life and death, with death very much dominating the poem (Keats suffered from tuberculosis, and his description of men suffering in Ode to a Nightingale could indicate that he himself was in great pain when he wrote the poem), whereas The Prelude describes a conflict between man and nature, and Ode to Autumn simply admires an aspect of nature. However, Keats and Wordsworth both allude to ideals expressed in the philosophical viewpoint of Romanticism. Wordsworth thought that the individual could understand nature without society or civilisation, and this is the stance that he takes in The Prelude. The metaphor of a single person in a boat in the middle of a huge lake represents one person in isolation from society. The mountain that towers over the person in the boat represents the raw power of nature, so much more powerful than a mere human (a Romantic ideal is that nature comes first, while people and their thoughts and activities come second; Wordsworth takes this to an extreme in The Prelude with his descriptive comparison of the "huge peak, black and huge" and the little boat. The imagery comes across very vividly in the poem, and man seems insignificant when compared with the "huge and mighty forms, that do not live like living men.") Keats also expresses his idea of the power of nature, but from a different viewpoint. He does not see nature as raw, wild power, a colossus compared with trivial humans. He instead regards nature as a friend in suffering (in Ode to a Nightingale: "Now more than ever it seems rich to die...while thou art pouring thy soul abroad") and as a thing with its own magic (Ode to Autumn: "Where are the songs of Spring?...Think not of them, thou hast thy music too"). In Ode to a Nightingale Keats also sees the nightingale as a thing of immense spiritual power, something so powerful that it can trigger his imagination and send him into a fantasy world of "verdurous glooms and winding mossy ways" where he can forget his pain for a short while, even though afterwards he is forced to realise that his poetry cannot help him escape his pain permanently ("the fancy cannot cheat so well as she is fam'd to do, deceiving elf"). This is another similarity the two writers share: they both describe spiritual experiences that have happened to them. Wordsworth describes the effect that the view of the megalithic mountain had on him ("but after I had seen that spectacle, for many days, my brain worked with a dim and undetermined sense of unknown modes of being") and describes his feelings of solitude and "blank desertion" that were a trouble to his dreams. Keats uses a great deal of entrancing imagery ("soft incense," "embalmed darkness," "pastoral eglantine," "musky rose," "full of dewy wine" and "murmurous haunt of flies" all create a very clear picture of the fantasy world Keats has conjured up in his imagination, influenced by the song of the nightingale) and emotive language (the poem is full of exclamations such as "Away!", "Adieu!" and "Forlorn!" that seem almost like laments, especially in the case of "thou wast not born for death, immortal Bird!") in Ode to a Nightingale, succeeding in drawing the reader into a bond with his thoughts where they can see, hear and smell everything that Keats is experiencing. This sort of empathy through poetry is very difficult to achieve, though Keats also manages it in Ode to Autumn through his descriptions of the "season of mists and mellow fruitfulness."
Keats does not reflect much on his experience in Ode to a Nightingale, except to wonder: "Was it a vision, or a waking dream?... Do I wake or sleep?" This last question lets readers reflect for themselves on the meaning of the nightingale (though throughout the poem the references to "easeful Death" and "Darkling" make it clear that the bird symbolises death). Keats and Wordsworth have widely different styles of writing. Their poems differ greatly in language, form and structure, especially between Wordsworth's simple language and Keats's traditionally embellished diction. However, both poets had troubled times in their lives, and their poems (Ode to a Nightingale and The Prelude) reflect this. Both portray their spiritual encounters with nature as having had a great effect on them, which is in keeping with the Romantic ideals of nature and spirituality. They also express their Romantic views of nature as a source of power, though they differ on the type of power that nature possesses.
Thursday, November 14, 2019
The Metamorphosis of Eliza Doolittle in Pygmalion by George Bernard Shaw
The benefits of acquiring an education are not limited to the academic aspects often associated with it. Part of the edification it bestows includes being enabled to reach new insight, being empowered to cultivate a new awareness, and being endowed with a new understanding of life and of self. In Bernard Shaw's Pygmalion, Eliza Doolittle experiences this type of enlightenment as the result of undergoing a drastic change in social status. With the sponsorship and guidance of Colonel Pickering, Eliza, a common street flower vendor, receives phonetic instruction from Professor Henry Higgins and is transformed into an elegant and refined "duchess" (817). Eliza Doolittle is highly emotional and has dauntless pride; however, her level of confidence increases as she gains a new perception of herself and a new outlook on life through the instruction she receives.

Although in the beginning of the play Eliza Doolittle possesses a dignity of self that has persevered despite the lowliness of her social status as a "draggletailed guttersnipe" (817), she has little confidence and a low sense of worth. By describing Eliza's emotional states throughout the play, Shaw illuminates the evolution of Eliza's character. In the opening act, when Eliza receives the impression that she is being "charged" for "taking advantage of [a] gentleman's proximity" to persuade him to "buy a flower," Shaw notes that she becomes "terrified" and claims, "I ain't done nothing wrong . . . I've a right to sell flowers . . ." (806). Eliza's initial feeling of fear points to a momentary sense of self-doubt in her character; however, her solid pride leads her to make a declaration in def... ...f" as she "sweeps out" (864). Too proud to be bossed around, Eliza is confident enough to stand her ground and defend her dignity without being timid. Although it was in Eliza's sensitive nature to "fetch slippers," now she "won't care for anybody that doesn't care for [her]" (860).

Eliza Doolittle continually manifested pride and a touchy sensitivity; however, once educated, the drastic change of experiencing a substantially improved social standing caused the development of visible confidence in her character. Armed with self-esteem, Eliza had the necessary force in her character to face adversity without doubting herself or relying on the strength of others.

Works Cited

Shaw, Bernard. Pygmalion. Introduction to Literature: Reading, Analyzing, and Writing. 2nd ed. Ed. Dorothy U. Seyler and Richard A. Wilan. Englewood Cliffs: Prentice, 1990. 800-64.
Tuesday, November 12, 2019
Chameleon Chips
INTRODUCTION

Today's microprocessors sport a general-purpose design, which has its own advantages and disadvantages.

• Advantage: One chip can run a range of programs. That's why you don't need separate computers for different jobs, such as crunching spreadsheets or editing digital photos.
• Disadvantage: For any one application, much of the chip's circuitry isn't needed, and the presence of those "wasted" circuits slows things down.

Suppose, instead, that the chip's circuits could be tailored specifically for the problem at hand, say computer-aided design, and then rewired, on the fly, when you loaded a tax-preparation program. One set of chips, little bigger than a credit card, could do almost anything, even changing into a wireless phone. The market for such versatile marvels would be huge, and would translate into lower costs for users. So computer scientists are hatching a novel concept that could increase number-crunching power and trim costs as well. Call it the chameleon chip.

Chameleon chips would be an extension of what can already be done with field-programmable gate arrays (FPGAs). An FPGA is covered with a grid of wires. At each crossover there is a switch that can be semipermanently opened or closed by sending it a special signal. Usually the chip must first be inserted in a little box that sends the programming signals. But now, labs in Europe, Japan, and the U.S. are developing techniques to rewire FPGA-like chips at any time, and even software that can map out circuitry optimized for specific problems. The chips still won't change colors, but they may well color the way we use computers in years to come.

The chameleon chip is a fusion between custom integrated circuits and programmable logic. For highly performance-oriented tasks, custom chips that do one or two things spectacularly, rather than many things averagely, are used. With field-programmed chips, we now have chips that can be rewired in an instant, so the benefits of customization can be brought to the mass market.

A reconfigurable processor is a microprocessor with erasable hardware that can rewire itself dynamically. This allows the chip to adapt effectively to the programming tasks demanded by the particular software it is interfacing with at any given time. Ideally, the reconfigurable processor can transform itself from a video chip to a central processing unit (CPU) to a graphics chip, for example, each optimized to allow applications to run at the highest possible speed. Such chips can be called "chips on demand." In practical terms, this ability translates into immense flexibility in device functions. For example, a single device could serve as both a camera and a tape recorder (among numerous other possibilities): you would simply download the desired software and the processor would reconfigure itself to optimize performance for that function (a toy software illustration of this idea closes this introduction).

Reconfigurable processors compete in the market with traditional hard-wired chips and several types of programmable microprocessors. Programmable chips have been in existence for over ten years. Digital signal processors (DSPs), for example, are high-performance programmable chips used in cell phones, automobiles, and various types of music players. Another variety, programmable logic chips, are equipped with arrays of memory cells that can be programmed to perform hardware functions using software tools. These are more flexible than specialized DSP chips but also slower and more expensive. Hard-wired chips are the oldest, cheapest, and fastest, but also the least flexible, of all the options.
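To make the "chip on demand" idea concrete, here is a deliberately tiny C sketch of our own (not vendor code) that models reconfiguration as swapping a function-pointer "personality"; a real device would load a hardware bitstream instead.

/* Illustrative sketch only: the "chip on demand" idea modeled in software.
 * A real reconfigurable processor loads a hardware configuration (a bitstream);
 * here a function-pointer table stands in for that configuration step.
 * All names below are hypothetical, not part of any vendor API. */
#include <stdio.h>

typedef void (*device_function)(void);

static void run_camera(void)        { puts("capturing image..."); }
static void run_tape_recorder(void) { puts("recording audio..."); }

/* "Reconfigure" the device by installing a new personality. */
static device_function current_config = NULL;

static void reconfigure(device_function f) { current_config = f; }

int main(void)
{
    reconfigure(run_camera);        /* download "camera" configuration  */
    current_config();
    reconfigure(run_tape_recorder); /* swap in "tape recorder" hardware */
    current_config();
    return 0;
}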
CHAMELEON CHIPS

Chameleon's chips are highly flexible processors that can be reconfigured remotely in the field; they are designed to simplify communication system design while delivering better price/performance. The chameleon chip is a high-bandwidth reconfigurable communications processor (RCP). It aims at letting designers change a system's design from a remote location, which will mean more versatile handhelds. The processors operate at 24,000 16-bit million operations per second (MOPS) and 3,000 16-bit million multiply-accumulates per second (MMACS), and provide 50 channels of CDMA2000 chip-rate processing. The 0.25-micron CS2112 is an example.

These new chips are able to rewire themselves on the fly to create the exact hardware needed to run a piece of software at the utmost speed; such a chip can also be called a "chip on demand." Reconfigurable computing goes a step beyond programmable chips in the matter of flexibility: it is not only possible but relatively commonplace to "rewrite" the silicon so that it can perform new functions in a split second. Reconfigurable chips are simply the extreme end of programmability. The overall performance of the ACM (QuickSilver's adaptive computing machine, discussed later) can surpass that of a DSP because the ACM constructs only the actual hardware needed to execute the software, whereas DSPs and microprocessors force the software to fit their given architecture.

One reason this type of versatility is not possible today is that handheld gadgets are typically built around highly optimized specialty chips that do one thing really well. These chips are fast and relatively cheap, but their circuits are literally written in stone, or at least in silicon. A multipurpose gadget would have to carry many specialized chips, a costly and clumsy solution. Alternately, you could use a general-purpose microprocessor, like the one in your PC, but that would be slow as well as expensive. For these reasons, chip designers are turning increasingly to reconfigurable hardware: integrated circuits where the architecture of the internal logic elements can be arranged and rearranged on the fly to fit particular applications.

Designers of multimedia systems face three significant challenges in today's ultra-competitive marketplace: products must do more, cost less, and be brought to market quicker than ever. Though each of these goals is individually attainable, the hat trick is generally unachievable with traditional design and implementation techniques. Fortunately, new techniques are emerging from the study of reconfigurable computing that make it possible to design systems satisfying all three requirements simultaneously.

Although originally proposed in the late 1960s by a researcher at UCLA, reconfigurable computing is a relatively new field of study. The decades-long delay had mostly to do with a lack of acceptable reconfigurable hardware. Reprogrammable logic chips like field-programmable gate arrays (FPGAs) have been around for many years, but they have only recently reached gate densities making them suitable for high-end applications (the densest of the current FPGAs have approximately 100,000 reprogrammable logic gates). With an anticipated doubling of gate densities every 18 months, the situation will only become more favorable from this point forward.
The primary product is groundstation equipment for satellite communications. This application involves high-rate communications, signal processing, and a variety of network protocols and data formats.

ADVANTAGES AND APPLICATIONS

Its applications include:
• data-intensive Internet processing
• DSP
• wireless basestations
• voice compression
• software-defined radio
• high-performance embedded telecom and datacom applications
• xDSL concentrators
• fixed wireless local loop
• multichannel voice compression
• multiprotocol packet and cell processing

Its advantages are that it:
• can create customized communications signal processors
• offers increased performance and channel count
• can more quickly adapt to new requirements and standards
• lowers development costs and reduces risk

FPGA

One of the most promising approaches in the realm of reconfigurable architecture is the field-programmable gate array. The strategy is to build uniform arrays of thousands of logic elements, each of which can take on the personality of different fundamental components of digital circuitry; the switches and wires can be reprogrammed to operate in any desired pattern, effectively rewiring a chip's circuitry on demand. A designer can download a new wiring pattern and store it in the chip's memory, where it can be easily accessed when needed.

Not so hard after all

Reconfigurable hardware first became practical with the introduction a few years ago of the field-programmable gate array (FPGA) by Xilinx, an electronics company now based in San Jose, California. An FPGA is a chip consisting of a large number of "logic cells". These cells, in turn, are sets of transistors wired together to perform simple logical operations (a toy C model of one such cell appears at the end of this section).

Evolving FPGAs

FPGAs are arrays of logic blocks that are strung together through software commands to implement higher-order logic functions. Logic blocks are similar to switches with multiple inputs and a single output, and are used in digital circuits to perform binary operations. Unlike with other integrated circuits, developers can alter both the logic functions performed within the blocks and the connections between the blocks by sending signals, programmed in software, to the chip. FPGA blocks can perform the same high-speed hardware functions as fixed-function ASICs, and, to distinguish them from ASICs, they can be rewired and reprogrammed at any time from a remote location through software. Although it took several seconds or more to change connections in the earliest FPGAs, today's FPGAs can be reconfigured in milliseconds.

Field-programmable gate arrays have historically been applied as what is called glue logic in embedded systems, connecting devices with dissimilar bus architectures. They have often been used to link digital signal processors (CPUs used for digital signal processing) to general-purpose CPUs. The growth in FPGA technology has lifted the arrays beyond the simple role of providing glue logic. With their current capabilities, they can now clearly be classed as system-level components, just like CPUs and DSPs. The largest of the FPGA devices made by the company with which one of the authors of this article is affiliated, for example, has more than 150 million transistors, seven times more than a Pentium-class microprocessor.
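As a rough software analogy for the logic cells just described, the following C sketch models a single 4-input lookup table: loading a different 16-bit truth table "rewires" the cell, much as loading a configuration bitstream rewires a real FPGA. This is a conceptual model of our own, not any vendor's toolflow.

/* Conceptual model of one FPGA logic cell (a 4-input lookup table).
 * Loading a different 16-bit truth table "rewires" the cell. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint16_t truth_table;   /* one output bit per 4-bit input combination */
} lut4;

/* "Configure" the cell by loading its truth table. */
static void lut4_configure(lut4 *cell, uint16_t bits) { cell->truth_table = bits; }

/* Evaluate the cell: the 4 inputs select one bit of the truth table. */
static int lut4_eval(const lut4 *cell, unsigned a, unsigned b, unsigned c, unsigned d)
{
    unsigned index = (a & 1) | ((b & 1) << 1) | ((c & 1) << 2) | ((d & 1) << 3);
    return (cell->truth_table >> index) & 1;
}

int main(void)
{
    lut4 cell;
    lut4_configure(&cell, 0x8000);  /* truth table of a 4-input AND gate */
    printf("AND(1,1,1,1) = %d\n", lut4_eval(&cell, 1, 1, 1, 1));
    lut4_configure(&cell, 0xFFFE);  /* reload: now a 4-input OR gate */
    printf("OR(0,0,0,0)  = %d\n", lut4_eval(&cell, 0, 0, 0, 0));
    return 0;
}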
Given today's time-to-market pressures, it is increasingly critical that all system-level components be easy to integrate, especially since the phase involving the integration of multiple technologies has become the most time-consuming part of a product's development cycle.

Integrating Hardware and Software

Systems designers producing mixed CPU and FPGA designs can take advantage of deterministic real-time operating systems (RTOSs). Deterministic software is suited for controlling hardware; as such, it can be used to efficiently manage the content of system data and the flow of such data from a CPU to an FPGA. FPGA developers can work with RTOS suppliers to facilitate the design and deployment of systems using combinations of the two technologies. FPGAs operating in conjunction with embedded design tools provide an ideal platform for developing high-performance reconfigurable computing solutions for medical instrument applications. The platform supports the design, development, and testing of embedded systems based on the C language. Integration of FPGA technology into systems using a deterministic RTOS can be streamlined by means of an enhanced application programming interface (API). The blending of hardware, firmware, application software, and an RTOS into a platform-based approach removes many of the development barriers that still limit the functionality of embedded applications.

Development, profiling, and analysis tools are available that can be used to analyze computational hot spots in code and to perform low-level timing analysis in multitasking environments. One way developers can use these analytical tools is to decide whether to implement a function in hardware or software. Profiling enables them to quickly identify functionality that is frequently used or computationally intensive; such functions may be prime candidates for moving from software to FPGA hardware. An integrated suite of run-time analysis tools, with a run-time error checker and a visual interactive profiler, can help developers create higher-quality, higher-performance code in little time (a minimal timing harness in C is sketched at the end of this section).

An FPGA consists of an array of configurable logic blocks that implement the logical functions. The logic functions performed within the blocks, and the connections between the blocks, can be altered by sending signals to the chip. These blocks are similar in structure to the gate arrays used in some ASICs, but whereas standard gate arrays are configured and fixed during manufacture, the configurable logic blocks in new FPGAs can be rewired and reprogrammed repeatedly, in around a microsecond.

The advantages of FPGAs include:
• short time to market
• flexibility and easy upgrades
• low manufacturing cost

An FPGA can be configured using:
• VHDL (the VHSIC Hardware Description Language)
• Handel-C
• Java

FPGAs are used presently in encryption, image processing, and mobile communications, and they can also be used in 4G mobile communication. Field-programmable gate arrays offer companies the possibility of developing a chip very quickly, since a chip can be configured by software. A chip can also be reconfigured, either during execution or as part of an upgrade to allow new applications, simply by loading a new configuration into the chip. The advantages can be seen in terms of cost, speed, and power consumption, and the added functionality of multi-parallelism allows one FPGA to replace multiple ASICs.
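The timing harness below is a minimal sketch of the profiling step described above, using only standard C. The dot_product routine is an assumed example of a hot spot; a routine that dominates such a measurement would be a candidate for an FPGA kernel.

/* A minimal profiling sketch in standard C: time a candidate function
 * to expose computational hot spots, the measurement used to decide
 * which routines to move from software into FPGA hardware.
 * The function names here are hypothetical examples. */
#include <stdio.h>
#include <time.h>

#define N 1000000

/* A compute-intensive candidate: dot product over large arrays. */
static long dot_product(const int *a, const int *b, int n)
{
    long sum = 0;
    for (int i = 0; i < n; i++)
        sum += (long)a[i] * b[i];
    return sum;
}

static int x[N], y[N];

int main(void)
{
    for (int i = 0; i < N; i++) { x[i] = i & 0xFF; y[i] = (i >> 3) & 0xFF; }

    clock_t start = clock();
    long result = dot_product(x, y, N);
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    printf("dot_product = %ld, took %.6f s\n", result, secs);
    /* A routine that dominates the profile like this one is a prime
     * candidate for implementation as an FPGA kernel. */
    return 0;
}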
The applications of FPGAs include:
• image processing
• encryption
• mobile communication
• memory management and digital signal processing
• telephone units
• mobile base stations

Although it is very hard to predict the direction this technology will take, it seems more than likely that future silicon chips will be a combination of programmable logic, memory blocks, and specific function blocks such as floating-point units. It looks likely that the technology will have to change over the coming years, and the rate of change for major players in today's marketplace, such as Intel, Microsoft and AMD, will be crucial to their survival.

The precise behaviour of each cell is determined by loading a string of numbers into a memory underneath it. The way in which the cells are interconnected is specified by loading another set of numbers into the chip. Change the first set of numbers and you change what the cells do; change the second set and you change the way they are linked up. Since even the most complex chip is, at its heart, nothing more than a bunch of interlinked logic circuits, an FPGA can be programmed to do almost anything that a conventional fixed piece of logic circuitry can do, just by loading the right numbers into its memory. And by loading in a different set of numbers, it can be reconfigured in the twinkling of an eye.

Basic reconfigurable circuits already play a huge role in telecommunications. For instance, relatively simple versions made by companies such as Xilinx and Altera are widely used for network routers and switches, enabling circuit designs to be easily updated electronically without replacing chips. In these early applications, however, the speed at which the chips reconfigure themselves is not critical. To be quick enough for personal information devices, the chips will need to completely reconfigure themselves in a millisecond or less. "That kind of chameleon device would be the killer app of reconfigurable computing." Industry experts predict that in the next couple of years reconfigurable systems will be used in cell phones to handle things like changes in telecommunications systems or standards as users travel between calling regions, or between countries.

It is getting more expensive and difficult to pattern, or etch, the elaborate circuitry used in microprocessors; many experts have predicted that maintaining the current rate of putting more circuits into ever smaller spaces will, sometime in the next 10 to 15 years, result in features on microchips no bigger than a few atoms, which would demand a nearly impossible level of precision in fabricating circuitry. Reconfigurable chips, however, do not need that type of precision, and with them we can make computers that function at the nanoscale level.

CS2112 (a reconfigurable processor developed by Chameleon Systems)

The RCP architecture is designed to be as flexible as an FPGA and as easy to program as a digital signal processor (DSP), with real-time, visual debugging capability. The development environment, comprising Chameleon's C-SIDE software tool suite and CT2112SDM development kit, enables customers to develop and debug communication and signal processing systems running on the RCP. The RCP's development environment helps overcome a fundamental design and debug challenge facing communication system designers.
In order to build sufficient performance, channel capacity, and flexibility into their systems, today's designers have been forced to employ an amalgamation of DSPs, FPGAs and ASICs, each of which requires a unique design and debug environment. The RCP platform was designed from the ground up to alleviate this problem: first by significantly exceeding the performance and channel capacity of the fastest DSPs; second by integrating a complete SoC subsystem, including an embedded microprocessor, PCI core, DMA function, and high-speed bus; and third by consolidating the design and debug environment into a single platform-based design system that affords the designer comprehensive visibility and control.

The C-SIDE software suite includes tools used to compile C and assembly code for execution on the CS2112's embedded microprocessor, and Verilog simulation and synthesis tools used to create parallel datapath kernels which run on the CS2112's reconfigurable processing fabric. In addition to code generation tools, the package contains source-level debugging tools that support simulation and real-time debugging.

Chameleon's design approach leverages the methods employed by most of today's communications system designers. The designer starts with a C program that models the signal processing functions of the baseband system. Having identified the dataflow-intensive functional blocks, the designer implements them on the RCP to accelerate them by 10- to 100-fold (a sketch of such a candidate block appears at the end of this section). The designer creates equivalent functions for those blocks, called kernels, in Chameleon's reconfigurable assembly-language-like design entry language. The assembler then automatically generates standard Verilog for these kernels, which the designer can verify with commercial Verilog simulators. Using these tools, the designer can compare testbench results for the original C functions with similar results for the Verilog kernels. In the next phase, the designer synthesises the Verilog kernels using Chameleon's synthesis tools targeting Chameleon technology. At the end, the tools output a bit file that is used to configure the RCP. The designer then integrates the application-level C code with the Verilog kernels and the rest of the standard C functions. Chameleon's C-SIDE compiler and linker technology makes this integration step transparent to the designer.

The CS2112 development environment makes all chip registers and memory locations accessible through a development console that enables full processor-like debugging, including features like single-stepping and setting breakpoints. Before actually productising the system, the designer must often perform a system-level simulation of the data flow within the context of the overall system. Chameleon's development board enables the designer to connect multiple RCPs to other devices in the system using the PCI bus and/or programmable I/O pins. This helps prove the design concept, and enables the designer to profile the performance of the whole basestation system in a real-world environment.

With telecommunications OEMs facing shrinking product life cycles and increasing market pressures, not to mention the constant flux of protocols and standards, it is more necessary than ever to have a platform that is reconfigurable. This is where chameleon chips are going to make their effect felt.
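The following is a minimal sketch of the kind of dataflow-intensive C block a designer might single out in the flow described above. The FIR filter is our assumed example, not code from Chameleon's suite; in the real flow an equivalent kernel would be written in the design entry language and compiled to Verilog for the fabric.

/* A sketch of a dataflow-intensive block that a designer would identify
 * for acceleration on the reconfigurable fabric. The multiply-accumulate
 * inner loop maps naturally onto datapath units and 16x24 multipliers. */
#include <stdio.h>

#define TAPS 8

/* 16-bit FIR filter over one window of samples. */
static int fir(const short *coeff, const short *sample, int n)
{
    int acc = 0;
    for (int i = 0; i < n; i++)
        acc += coeff[i] * sample[i];   /* MAC: the hot operation */
    return acc;
}

int main(void)
{
    short coeff[TAPS]  = { 1, 2, 3, 4, 4, 3, 2, 1 };
    short sample[TAPS] = { 10, 10, 10, 10, 10, 10, 10, 10 };
    printf("fir output = %d\n", fir(coeff, sample, TAPS));
    return 0;
}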
The Chameleon CS2112 package is a high-bandwidth, reconfigurable communications processor aimed at:
• second- and third-generation (2G-3G) wireless base stations
• fixed wireless local loop (WLL)
• voice over IP
• DSL (digital subscriber line)
• high-end DSP operations
• software-defined radio
• security processing

Traditional solutions such as FPGAs and DSPs lack the performance for high-bandwidth applications, and fixed-function solutions like ASICs impose unacceptable limits. Each product in the CS2000 family has the same fundamental functional blocks: a 32-bit RISC processor, a full-featured memory controller, a PCI controller, and a reconfigurable processing fabric, all of which are interconnected by a high-speed system bus. The fabric comprises an array of reconfigurable tiles used to implement the desired algorithms. Each tile contains seven 32-bit reconfigurable datapath units, four blocks of local store memory, two 16×24-bit multipliers, and a control logic unit.

Basic Architecture

Components:
• 32-bit RISC ARC processor @ 125 MHz
• 64-bit memory controller
• 32-bit PCI controller
• reconfigurable processing fabric (RPF)
• high-speed system bus
• programmable I/O (160 pins)
• DMA subsystem
• configuration subsystem

More on the architecture of the RPF: there are 4 slices with 3 tiles in each, and each tile can be reconfigured at runtime (a toy data model of this hierarchy appears at the end of this section). Tiles contain:
• datapath units
• local store memories
• 16×24 multipliers
• a control logic unit

The C-SIDE design system is a fully integrated tool suite, with C compiler, Verilog synthesizer, and full-chip simulator, as well as a debug and verification environment, an element not readily found in ASIC and FPGA design flows, according to Chameleon.

Still, reconfigurable chips represent an attempt to combine the best features of hard-wired custom chips, which are fast and cheap, and programmable logic device (PLD) chips, which are flexible and easily brought to market. Unlike PLDs, QuickSilver's reconfigurable chips can be reprogrammed every few nanoseconds, rewiring circuits so they are processing global positioning satellite signals one moment and CDMA cellular signals the next. Think of the chips as consisting of libraries of preset hardware designs and chalkboards. Upon receiving instructions from software, the chip takes a hardware component from the library (which is stored as software in memory) and puts it on the chalkboard (the chip). The chip wires itself instantly to run the software and dispatches it; the hardware can then be erased for the next cycle. With this style of computing, its chips can operate 80 times as fast as a custom chip but still consume less power and board space, which translates into lower costs. The company believes that "soft silicon," or chips that can be reconfigured on the fly, can be the heart of multifunction camcorders or digital television sets.

With programmable logic devices, designers use inexpensive software tools to quickly develop, simulate, and test their designs. A design can then be quickly programmed into a device and immediately tested in a live circuit. The PLD used for this prototyping is the exact same PLD that will be used in the final production of a piece of end equipment, such as a network router, a DSL modem, a DVD player, or an automotive navigation system. The two major types of programmable logic devices are field-programmable gate arrays (FPGAs) and complex programmable logic devices (CPLDs). Of the two, FPGAs offer the highest logic density, the most features, and the highest performance; they are used in a wide variety of applications ranging from data processing and storage to instrumentation, telecommunications, and digital signal processing.
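As a purely descriptive aid, the C data model below encodes the fabric hierarchy described above (4 slices of 3 tiles; 7 datapath units, 4 local stores, and 2 multipliers per tile). The field names are our assumptions, not Chameleon's actual register map.

/* Illustrative data model of the CS2112 fabric hierarchy. Descriptive
 * only; a real configuration would be a vendor-generated bit file. */
#include <stdio.h>

enum { SLICES = 4, TILES_PER_SLICE = 3,
       DPUS_PER_TILE = 7, LOCAL_STORES_PER_TILE = 4, MULS_PER_TILE = 2 };

typedef struct {
    unsigned dpu_config[DPUS_PER_TILE];  /* per-DPU configuration words */
    unsigned active;                     /* tile currently in use?      */
} tile;

typedef struct { tile tiles[TILES_PER_SLICE]; } slice;

typedef struct { slice slices[SLICES]; } rpf;  /* reconfigurable processing fabric */

int main(void)
{
    rpf fabric = {0};
    fabric.slices[0].tiles[0].active = 1;  /* "configure" one tile */
    printf("tile(0,0) active: %u\n", fabric.slices[0].tiles[0].active);
    printf("total tiles: %d, total DPUs: %d\n",
           SLICES * TILES_PER_SLICE,
           SLICES * TILES_PER_SLICE * DPUS_PER_TILE);
    return 0;
}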
To overcome the limitations of rigid, fixed-function architectures and offer a flexible, cost-effective solution, many new entrants to the DSP market are extolling the virtues of configurable and reconfigurable DSP designs. This latest breed of DSP architectures promises greater flexibility to quickly adapt to numerous and fast-changing standards, and the vendors claim to achieve higher performance without adding silicon area, cost, design time, or power consumption. In essence, because the architecture isn't rigid, the reconfigurable DSP lets the developer tailor the hardware for a specific task, achieving the right size and cost for the target application; moreover, the same platform can be reused for other applications. Because development tools are a critical part of this solution (in fact, they are true enablers), the newcomers also ensure that the tools are robust and tightly linked to the devices' flexible architectures, while providing an intuitive, integrated, and affordable development environment for designers.

RECONFIGURING THE ARCHITECTURE

Some of the new configurable DSP architectures are reconfigurable too; that is, developers can modify their landscape on the fly, depending on the incoming data stream. This capability permits dynamic reconfigurability of the architecture as demanded by the application. Proponents of such chips are proclaiming an era of "chip-on-demand," wherein new algorithms can be accommodated on-chip in real time via software. This eliminates the cumbersome job of fitting the latest algorithms and protocols into existing rigid hardware. A reconfigurable communications processor (RCP) can be reconfigured for different processing algorithms in one clock cycle.

Chameleon designers are revising the architecture to create a chip that can address a much broader range of applications, and the supplier is preparing a new, more user-friendly suite of tools for traditional DSP designers. Thus the company is dropping the term reconfigurability for the new architecture and going with a more traditional name, the streaming data processor (SDP). Though the SDP will include a reconfigurable processing fabric, it will be substantially altered, the company says. Unlike the older RCP, the new chip won't have the ARC RISC core, and it will support a much higher clock rate. Additionally, it will be implemented in a 0.13-micron CMOS process to meet the signal processing needs of a much broader market. Further details await the release of the SDP sometime in the first quarter of 2003.

While Chameleon is in redesign mode, QuickSilver Technologies is in test mode. This reconfigurable proponent, which prefers to call its architecture an adaptive computing machine (ACM), has realized its first silicon test chip. In fact, tests indicate that it outperforms a hardwired, fixed-function ASIC in processing compute-intensive cdma2000 algorithms, like system acquisition, rake finger, and set maintenance. For example, the ASIC's nominal speed for searching 215 phase offsets in a basic multipath search algorithm is 3. seconds, while the ACM test chip took just one second at a 25-MHz clock speed to perform the same number of searches in a cdma2000 handset.
Likewise, the device accomplishes over 57,000 adaptations per second in rake-finger operation, cycling through all operations in this application every 52 µs. In the set-maintenance application, the chip is almost three times faster than an ASIC, claims QuickSilver.

The power of a computer stems from the fact that its behaviour can be changed with little more than a dose of new software. A desktop PC might, for example, be browsing the Internet one minute, and running a spreadsheet or entering the virtual world of a computer game the next. But the ability of a microprocessor (the chip at the heart of any PC) to handle such a variety of tasks is both a strength and a weakness, because hardware dedicated to a particular job can do things so much faster. Recognising this, the designers of modern PCs often hand over such tasks as processing 3-D graphics, decoding and playing movies, and processing sound (things that could, in theory, be done by the basic microprocessor) to specialist chips. These chips are designed to do their particular jobs extremely fast, but they are inflexible in comparison with a microprocessor, which does its best to be a jack-of-all-trades. So the hardware approach is faster, but using software is more flexible.

At the moment, reconfigurable chips are used mainly as a way of conjuring up specialist hardware in a hurry. Rather than designing and building an entirely new chip to carry out a particular function, a circuit designer can use an FPGA instead. This speeds up the design process enormously, because making changes becomes as simple as downloading a new configuration into the chip. Chameleon Systems also develops reconfigurable chips for the high-end telecom-switching market.

RECONFIGURABLE PROCESSORS

As described in the introduction, a reconfigurable processor is a microprocessor with erasable hardware that can rewire itself dynamically, adapting to the tasks demanded by the software it is running at any given time: a "chip on demand." While microprocessors have been the dominant devices for general-purpose computing over the last decade, there is still a large gap between the computational efficiency of microprocessors and custom silicon. Reconfigurable devices, such as FPGAs, have come closer to closing that gap, offering a 10x benefit in computational density over microprocessors, and often another potential 10x improvement in yielded functional density on low-granularity operations.
On highly regular computations, reconfigurable architectures have a clear superiority over traditional processor architectures; on tasks with high functional diversity, microprocessors use silicon more efficiently than reconfigurable devices. The BRASS project is developing a coupled architecture which allows a reconfigurable array and a processor core to cooperate efficiently on computational tasks, exploiting the strengths of both. The project is developing an architecture and a prototype component that combine a processor and a high-performance reconfigurable array on a single chip. The reconfigurable array extends the usefulness and efficiency of the processor by providing the means to tailor its circuits for special tasks, while the processor improves the efficiency of the reconfigurable array for irregular, general-purpose computation. A processor combined with reconfigurable resources can be expected to achieve a significant performance improvement over either a separate processor or a separate reconfigurable device on an interesting range of problems drawn from embedded computing applications. As such, the composite device is an ideal system element for embedded processing.

Reconfigurable devices have proven extremely efficient for certain types of processing tasks. The key to their cost/performance advantage is that conventional processors are often limited by instruction bandwidth and execution restrictions, or by an insufficient number or type of functional units, whereas reconfigurable logic exploits more program parallelism. By dedicating significantly less instruction memory per active computing element, reconfigurable devices achieve a 10x improvement in functional density over microprocessors. At the same time, this lower memory ratio allows reconfigurable devices to deploy active capacity at a finer grain, letting them realize a higher yield of their raw capacity, sometimes as much as 10x, than conventional processors can.

The high functional density characteristic of reconfigurable devices comes at the expense of the high functional diversity characteristic of microprocessors. Microprocessors have evolved to a highly optimized configuration with clear cost/performance advantages over reconfigurable arrays for a large set of tasks with high functional diversity. By combining a reconfigurable array with a processing core, the project hopes to achieve the best of both worlds. While it is possible to combine a conventional processor with commercial reconfigurable devices at the circuit-board level, integration radically changes the I/O costs and design point for both devices, resulting in a qualitatively different system. Notably, the lower on-chip communication costs allow efficient cooperation between the processor and the array at a finer grain than is sensible with discrete designs.

RECONFIGURABLE COMPUTING

When we talk about reconfigurable computing we're usually talking about FPGA-based system designs. Unfortunately, that doesn't qualify the term precisely enough. System designers use FPGAs in many different ways. The most common use of an FPGA is for prototyping the design of an ASIC. In this scenario, the FPGA is present only on the prototype hardware and is replaced by the corresponding ASIC in the final production system; this use of FPGAs has nothing to do with reconfigurable computing. However, many system designers are choosing to leave the FPGAs as part of the production hardware.
Lower FPGA prices and higher gate counts have helped drive this change. Such systems retain the execution speed of dedicated hardware but also have a great deal of functional flexibility. The logic within the FPGA can be changed if or when it is necessary, which has many advantages. For example, hardware bug fixes and upgrades can be administered as easily as their software counterparts: in order to support a new version of a network protocol, you can redesign the internal logic of the FPGA and send the enhancement to the affected customers by email. Once they've downloaded the new logic design to the system and restarted it, they'll be able to use the new version of the protocol. This is configurable computing; reconfigurable computing goes one step further.

Reconfigurable computing involves manipulation of the logic within the FPGA at run-time. In other words, the design of the hardware may change in response to the demands placed upon the system while it is running. Here, the FPGA acts as an execution engine for a variety of different hardware functions, some executing in parallel, others in serial, much as a CPU acts as an execution engine for a variety of software threads. We might even go so far as to call the FPGA a reconfigurable processing unit (RPU).

Reconfigurable computing allows system designers to execute more hardware than they have gates to fit, which works especially well when there are parts of the hardware that are occasionally idle. One theoretical application is a smart cellular phone that supports multiple communication and data protocols, though just one at a time. When the phone passes from a geographic region served by one protocol into a region served by another, the hardware is automatically reconfigured (a sketch of this idea appears at the end of this section). This is reconfigurable computing at its best, and using this approach it is possible to design systems that do more, cost less, and have shorter design and implementation cycles.

Reconfigurable computing has several advantages:

• First, it is possible to achieve greater functionality with a simpler hardware design. Because not all of the logic must be present in the FPGA at all times, the cost of supporting additional features is reduced to the cost of the memory required to store the logic design. Consider again the multiprotocol cellular phone: it would be possible to support as many protocols as could fit into the available on-board ROM, and it is even conceivable that new protocols could be uploaded from a base station to the handheld phone on an as-needed basis, requiring no additional memory.

• The second advantage is lower system cost, which does not manifest itself exactly as you might expect. On a low-volume product there will be some production cost savings, resulting from the elimination of the expense of ASIC design and fabrication. For higher-volume products, however, the production cost of fixed hardware may actually be lower, and we have to think in terms of lifetime system costs to see the savings: systems based on reconfigurable computing are upgradable in the field, and such changes extend the useful life of the system, thus reducing lifetime costs.

• The final advantage is reduced time-to-market. The fact that you're no longer using an ASIC is a big help in this respect: there are no chip design and prototyping cycles, which eliminates a large amount of development effort. In addition, the logic design remains flexible right up until (and even after) the product ships. This allows an incremental design flow, a luxury not typically available to hardware designers. You can even ship a product that meets the minimum requirements and add features after deployment; in the case of a networked product like a set-top box or cellular telephone, it may even be possible to make such enhancements without customer involvement.
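Here is the promised sketch of the multiprotocol phone, in C. Every name in it (fpga_load, the .bit files, and so on) is a hypothetical stand-in, not a real vendor API; the point is only the shape of the logic, where crossing into a new calling region triggers a reload of the reconfigurable hardware.

/* Sketch of the multiprotocol-phone idea. Each protocol is a stored
 * logic design ("bitstream"); a region change reloads the FPGA. */
#include <stdio.h>

typedef enum { PROTO_GSM, PROTO_CDMA, PROTO_COUNT } protocol;

static const char *bitstream[PROTO_COUNT] = {
    "gsm_modem.bit",   /* logic designs stored in on-board memory */
    "cdma_modem.bit",
};

/* Hypothetical stand-in for downloading a logic design to the FPGA. */
static void fpga_load(const char *design)
{
    printf("reconfiguring FPGA with %s\n", design);
}

static void on_region_change(protocol new_proto)
{
    fpga_load(bitstream[new_proto]);   /* hardware changes under running software */
}

int main(void)
{
    on_region_change(PROTO_GSM);   /* phone powers up in a GSM region  */
    on_region_change(PROTO_CDMA);  /* user travels into a CDMA region  */
    return 0;
}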
RECONFIGURABLE HARDWARE

Traditional FPGAs are configurable, but not run-time reconfigurable. Many of the older FPGAs expect to read their configuration out of a serial EEPROM, one bit at a time, and can only be made to do so by asserting a chip reset signal. This means that the FPGA must be reprogrammed in its entirety and that its previous internal state cannot be captured beforehand. Though these features are compatible with configurable computing applications, they are not sufficient for reconfigurable computing. In order to benefit from run-time reconfiguration, the FPGAs involved must have some or all of several enabling features; the more of these features they have, the more flexible the system design can be.

Software must then manage the reconfigurable hardware, which involves:
• deciding which hardware objects to execute and when
• swapping hardware objects into and out of the reconfigurable logic
• performing routing between hardware objects, or between hardware objects and the hardware object framework

Of course, having software manage the reconfigurable hardware usually means having an embedded processor or microcontroller on board. (Several vendors are expected to introduce single-chip solutions that combine a CPU core and a block of reconfigurable logic.) The embedded software that runs there is called the run-time environment, and it is analogous to the operating system that manages the execution of multiple software threads. Like threads, hardware objects may have priorities, deadlines, contexts, and so on, and it is the job of the run-time environment to organize this information and make decisions based upon it. The reason we need a run-time environment at all is that there are decisions to be made while the system is running, and as human designers we are not available to make them. So we impart these responsibilities to a piece of software, which allows us to write our application software at a very high level of abstraction.

To execute a hardware object, the run-time environment must first locate space within the RPU that is large enough for it. It must then perform the necessary routing between the hardware object's inputs and outputs and the blocks of memory reserved for each data stream. Next, it must stop the appropriate clock, reprogram the internal logic, and restart the RPU. Once the object starts to execute, the run-time environment must continuously monitor the hardware object's status flags to determine when it is done executing. Once it is done, the caller can be notified and given the results. The run-time environment is then free to reclaim the reconfigurable logic gates that were taken up by that hardware object and to wait for additional requests to arrive from the application software (a skeleton of such a run-time loop is sketched below).

The principal benefits of reconfigurable computing are the ability to execute larger hardware designs with fewer gates and to realize the flexibility of a software-based solution while retaining the execution speed of a more traditional, hardware-based approach. This makes doing more with less a reality.
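The skeleton below sketches the run-time environment steps just described: place, route, start, monitor, reclaim. All types and functions are illustrative assumptions, not a real vendor API; a real implementation would manipulate configuration registers rather than a counter of "free gates".

/* Skeleton of the run-time environment loop: place, route, reconfigure,
 * monitor the status flag, then reclaim the gates. Hypothetical names. */
#include <stdio.h>
#include <stdbool.h>

typedef struct {
    const char *name;
    int gates_needed;
    bool done;           /* stands in for the hardware status flag */
} hw_object;

static int free_gates = 10000;            /* pretend RPU capacity */

static bool place(hw_object *o)           /* locate space in the RPU */
{
    if (o->gates_needed > free_gates) return false;
    free_gates -= o->gates_needed;
    return true;
}

static void route_and_start(hw_object *o) /* route I/O, load logic, restart clock */
{
    printf("routing and starting %s\n", o->name);
    o->done = true;      /* a real object would signal completion itself */
}

static void reclaim(hw_object *o)         /* free the gates for the next request */
{
    free_gates += o->gates_needed;
}

int main(void)
{
    hw_object obj = { "fir_kernel", 2500, false };

    if (place(&obj)) {
        route_and_start(&obj);
        while (!obj.done)                 /* poll the status flag */
            ;
        printf("%s finished; notifying caller\n", obj.name);
        reclaim(&obj);
    }
    return 0;
}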
In our own business we have seen tremendous cost savings, simply because our systems do not become obsolete as quickly as our competitors': reconfigurable computing enables the addition of new features in the field, allows rapid implementation of new standards and protocols on an as-needed basis, and protects the investment in computing hardware. Whether you do it for your customers or for yourselves, you should at least consider using reconfigurable computing in your next design. You may find, as we have, that the benefits far exceed the initial learning curve. And as reconfigurable computing becomes more popular, these benefits will only increase.

ADVANTAGES OF RECONFIGURABILITY

The term reconfigurable computing has come to refer to a loose class of embedded systems. Many system-on-a-chip (SoC) designs provide reconfigurability options that deliver the high performance of hardware with the flexibility of software. To most designers, SoC means encapsulating one or more processing elements, that is, general-purpose embedded processors and/or digital signal processor (DSP) cores, along with memory, input/output devices, and other hardware, into a single chip. These versatile chips can perform many different functions; however, while SoCs offer choices, the user can choose only among functions that already reside inside the device. Developers also create ASICs, chips that handle a limited set of tasks but do them very quickly. The limitation of most types of complex hardware devices (SoCs, ASICs, and general-purpose CPUs) is that the logical hardware functions cannot be modified once the silicon design is complete and fabricated. Consequently, developers are typically forced to amortize the cost of SoCs and ASICs over a product lifetime that may be extremely short in today's volatile technology environment.

Solutions involving combinations of CPUs and FPGAs allow hardware functionality to be reprogrammed, even in deployed systems, and enable medical instrument OEMs to develop new platforms for applications that require rapid adaptation to input. The combined technologies provide the best of both worlds for system-level design. Careful analysis of computational requirements reveals that many algorithms are well suited to high-speed sequential processing, many can benefit from parallel processing capabilities, and many can be broken down into components that are split between the two. With this in mind, it makes sense to always use the best technology for the job at hand. Processors are best suited to general-purpose and high-speed sequential processing (as are DSPs), while FPGAs excel at high-speed parallel processing. The general-purpose capability of the CPU enables it to perform system management very well, and allows it to control the content of the FPGAs in the system. This symbiotic relationship also means that the FPGA can off-load computationally intensive algorithms from the CPU, allowing the processor to spend more time on general-purpose tasks such as data analysis and on communicating with a printer or other equipment.

CONCLUSION

These new chips, called chameleon chips, are able to rewire themselves on the fly to create the exact hardware needed to run a piece of software at the utmost speed; such a chip can also be called a "chip on demand." Reconfigurable computing goes a step beyond programmable chips in the matter of flexibility.
It is not only possible but relatively commonplace to "rewrite" the silicon so that it can perform new functions in a split second; reconfigurable chips are simply the extreme end of programmability. Highly flexible processors that can be reconfigured remotely in the field, Chameleon's chips are designed to simplify communication system design while delivering better price/performance. The chameleon chip is a high-bandwidth reconfigurable communications processor (RCP) that aims at changing a system's design from a remote location, which will mean more versatile handhelds. Its applications are in data-intensive Internet processing, DSP, wireless basestations, voice compression, software-defined radio, high-performance embedded telecom and datacom applications, xDSL concentrators, fixed wireless local loop, multichannel voice compression, and multiprotocol packet and cell processing. Its advantages are that it can create customized communications signal processors, it has increased performance and channel count, it can more quickly adapt to new requirements and standards, and it lowers development costs and reduces risk.

A FUTURISTIC DREAM

One day, someone will make a chip that does everything for the ultimate consumer device. The chip will be smart enough to be the brains of a cell phone that can transmit or receive calls anywhere in the world. If the reception is poor, the phone will automatically adjust so that the quality improves. At the same time, the device will also serve as a handheld organizer and a player for music, videos, or games. Unfortunately, that chip doesn't exist today. It would require:
• flexibility
• high performance
• low power
• low cost

But we might be getting closer. A new kind of chip may reshape the semiconductor landscape. The chip adapts to any programming task by effectively erasing its hardware design and regenerating new hardware that is perfectly suited to run the software at hand. These chips, referred to as reconfigurable processors, could tilt the balance of power that has preserved a decade-long standoff between programmable chips and hard-wired custom chips. If these adaptable chips can reach cost-performance parity with hard-wired chips, customers will chuck the static hard-wired solutions. And if silicon can indeed become dynamic, then so will the gadgets of the information age. No longer will you have to buy both a camera and a tape recorder; you could buy one gadget, and then download a new function for it when you want to take pictures or make a recording. Just think of the possibilities for the fickle consumer.
For consumers, this means that the day isn't far away when a cell phone can be used to talk, transmit video images, connect to the Internet, maintain a calendar, and serve as entertainment during travel delays, all without the need to plug in adapter hardware.

REFERENCES

Books
• Wei Qin, presentation, Oct 2000 (the part of the presentation regarding the CS2000 is covered in this page)
• IEEE conference on telecommunication, 2001

Websites
• www.chameleonsystems.com
• www.thinkdigit.com
• www.ieee.org
• www.entecollege.com
• www.iec.org
• www.quicksilvertechnologies.com
• www.xilinx.com

ABSTRACT

Chameleon chips are chips whose circuitry can be tailored specifically for the problem at hand. They would be an extension of what can already be done with field-programmable gate arrays (FPGAs). An FPGA is covered with a grid of wires; at each crossover there is a switch that can be semipermanently opened or closed by sending it a special signal. Usually the chip must first be inserted in a little box that sends the programming signals, but labs in Europe, Japan, and the U.S. are now developing techniques to rewire FPGA-like chips at any time, and even software that can map out circuitry optimized for specific problems. The chips still won't change colors, but they may well color the way we use computers in years to come. The chameleon chip is a fusion between custom integrated circuits and programmable logic: for highly performance-oriented tasks, custom chips that do one or two things spectacularly, rather than many things averagely, are used, and with field-programmed chips we now have chips that can be rewired in an instant. Thus the benefits of customization can be brought to the mass market.

CONTENTS

• INTRODUCTION
• CHAMELEON CHIPS
• ADVANTAGES AND APPLICATIONS
• FPGA
• CS2112
• RECONFIGURING THE ARCHITECTURE
• RECONFIGURABLE PROCESSORS
• RECONFIGURABLE COMPUTING
• RECONFIGURABLE HARDWARE
• ADVANTAGES OF RECONFIGURABILITY
• CONCLUSION