Norway's cheapest books

Books published by MOHAMMED ABDUL SATTAR

  • by G. Udhayakumar
    405,-

    Chlorine is the most widely used oxidizing element for bleaching. It is also used as a disinfecting agent in many applications such as water and air treatment and food processing, but it has a limited effect in killing bacteria, produces an irritating odor and bad taste, and leaves chemical byproducts. Ozone, by contrast, generates no residues or harmful byproducts during the process. It is used in many industries, such as paper mills and cement mills for cooling-tower applications, and as a disinfecting agent for food processing and for water and air treatment. Ozone is generated both naturally and artificially. The natural route yields low ozone concentrations, so it cannot serve industrial purposes. Artificially, ozone can be generated in several ways, including ultraviolet (UV) treatment, corona discharge, electrolysis, and the radiochemical method. Corona discharge is the most commonly used: oxygen is passed between two plates (electrodes) in the presence of a high voltage, and ozone is generated as a result. The high voltage can be produced by different methods, and much work has been done on both ozone applications and generation methods. Conventionally, high voltage is generated from a 230 V, 50 Hz line supply using a step-up transformer. However, this approach requires a large transformer to convert low-voltage AC to high-voltage AC, so the supply is not compact. These problems are solved by raising the operating frequency of the system. Several converter configurations have been proposed for artificial ozone generators, all designed to produce a high-frequency supply using inverter topologies such as flyback, forward, push-pull, half-bridge and full-bridge. High-frequency power supplies for ozone generation offer advantages such as increased power density applied to the ozonizer electrode chamber surface and increased ozone production for a given surface area, while decreasing the necessary peak voltage. A high-frequency inverter, however, has greater switching losses, which affect system performance. Efficiency, ozone yield and supply power quality remain the main concerns. The current research therefore focuses on the design, prototype development and testing of a power supply converter with better efficiency and better ozone production, along with supply power factor improvement.
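
    As a back-of-the-envelope illustration (not taken from the book): for an ideal step-up transformer, the secondary-to-primary voltage ratio equals the turns ratio,

    $$\frac{V_s}{V_p} = \frac{N_s}{N_p},$$

    so stepping a 230 V, 50 Hz line up to, say, 10 kV requires a turns ratio of roughly 10000/230 ≈ 43; at 50 Hz such a transformer needs a large core, which is exactly the bulk that high-frequency inverter topologies avoid.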

  • by Sanjukta Badhai
    425,-

    The delivery of drug molecules into the human system for therapeutic purposes is a central concern of clinical research. Recent advances in the pharmaceutical sciences have brought significant progress in drug delivery systems. Looking back over the last decades, the pharmaceutical field has had a significant need for more patient-compatible and certifiably effective dosage forms in cancer and other diseases. Drug properties such as absorption, solubility and bioavailability, as well as carrier properties (e.g., retention time at the specific target site), are the bottleneck for effective drug distribution. Novel drug delivery dosage forms can overcome the drawbacks of conventional drugs by providing medication over a longer duration, improving drug solubility, maintaining drug activity, achieving target specificity and decreasing the likelihood of side effects. The aim of novel drug delivery mechanisms is to bring drug molecules to the target locations without drug depletion, keeping the plasma drug level steady for an extended period and improving patient compliance. Currently, many researchers across the globe are working on new, reliable targeted drug delivery systems that improve treatment efficacy while restricting side effects. In 1909, Paul Ehrlich began work on targeted particle delivery, anticipating a drug delivery procedure that would target diseased cells directly. Ehrlich pictured such delivery as a 'magic bullet': an arrangement in which a drug-carrying complex or conjugate would deliver drug molecules solely to pre-identified cells in a predictable manner. 'Drug targeting' is defined as the ability to direct drug molecules specifically to the sites of activity, with little or no interaction with non-target tissues.

  • by Pardeep Sangwan
    399,-

    The ground motions observed at the surface due to an artificial source (exploration seismology) or a natural source (earthquake seismology) are affected by source characteristics, medium properties and site conditions. The medium properties are related to the attenuation of seismic waves propagating between source and receiver, and they affect the amplitudes of ground motions at various distances from the source. Subsurface factors contributing to seismic wave attenuation are geometrical spreading, absorption, inter-bed multiples generated by thin layering, diffractions, and the focusing or defocusing effects of reflector curvature or velocity. Among these factors, absorption, which comprises both intrinsic and scattering effects, reveals geological information. Seismic wave attenuation plays a pivotal role in exploration seismology as well as earthquake seismology. In exploration, it helps improve seismic resolution for reliable interpretation of hydrocarbon prospects and can be used as an attribute for delineating reservoirs. In earthquake seismology, it correlates with regional heterogeneities such as major folds, faults or near-surface complexities and helps categorize the scale of tectonic activity in a region. Seismic waves undergo attenuation and dispersion owing to frictional losses or scattering while passing through the dissipative earth. The frictional losses are often termed intrinsic attenuation, which arises from the relative movement of grains or fluids in the rock matrix; intrinsic properties of the medium such as grain type, architecture, porosity, fluid type, viscosity, permeability, saturation and pressure are usually responsible for these losses in sedimentary rocks. Scattering attenuation arises from random heterogeneities in the near surface or weathered basalts and from layering effects such as stratigraphic filtering. Both intrinsic and scattering losses eventually lead to phase distortion, degradation of seismic resolution and poor imaging below absorptive zones.
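
    For illustration (a standard formulation, not necessarily the author's): intrinsic and scattering losses are commonly lumped into a quality factor Q, with plane-wave amplitude decaying along the path as

    $$A(x) = A_0 \exp\!\left(-\frac{\pi f x}{Q v}\right), \qquad \frac{1}{Q} = \frac{1}{Q_\mathrm{intrinsic}} + \frac{1}{Q_\mathrm{scattering}},$$

    where f is frequency and v is the propagation velocity; higher frequencies decay faster over the same path, which is why absorption degrades resolution below absorptive zones.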

  • by Kamal Batra
    412,-

    To sustain food security, there is pressure on food producers and researchers to use technology, especially information technology, in ways that meet demand while making judicious use of available resources. Over the years it has been observed that, despite the availability of the best varieties and associated inputs, the attainable yield of many crops falls below expected levels. To close this gap, proper policy may be framed using IT techniques in agriculture; Information and Communications Technology (ICT) based decision support for forecasting must therefore be visualized. In agriculture, a forecast and decision support system gives farmers reliable information from domain experts for proper, advance planning of their crops. Many unknown variables, such as weather and crop production, directly or indirectly affect crop prices and can lead to unexpected losses for farmers. One of the major cost contributors in agriculture is irrigation water: moisture loss due to evaporation adds to the cost of irrigation and affects the overall cost efficiency of farming. Water losses from primary water resources and water bodies therefore need to be estimated accurately when planning the water requirement of a particular crop. Against this background, an attempt has been made to study ICT-based forecasting, backed by an Artificial Neural Network (ANN) model, and a Decision Support System (DSS) to determine the irrigation requirement of a particular crop from evaporation losses. The ANN-based evaporation prediction aids the DSS in forecasting the irrigation water requirement for the selected crop in a particular agro-ecological zone. This not only provides valuable information and guidance to farmers but has become essential to making agriculture a viable business. Agriculture being complex in nature, a simple forecasting model and decision support system may not serve dynamic decision making. A web-based DSS using a forecast model built on data mining techniques (neural network and hybrid models) is a viable alternative, indispensable for disseminating this information efficiently and effectively, and it may reduce farmers' cost of cultivation. To improve estimation, newer data mining techniques, especially ANNs, are being used to estimate evaporation more accurately. ANN techniques are receiving more attention than traditional models because they learn from exemplar data and predict patterns using supervised learning.
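
    A minimal sketch of the idea in Python (not the book's model): a small feedforward ANN that predicts daily pan evaporation from weather variables. The feature set and the data are hypothetical stand-ins.

    # Sketch: ANN regression for evaporation, with synthetic data.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical inputs: temperature (degC), humidity (%), wind (km/h), sunshine (h)
    X = np.column_stack([
        rng.uniform(15, 40, n),
        rng.uniform(20, 90, n),
        rng.uniform(0, 25, n),
        rng.uniform(2, 12, n),
    ])
    # Synthetic target loosely mimicking evaporation's response to the inputs
    y = 0.25 * X[:, 0] - 0.05 * X[:, 1] + 0.1 * X[:, 2] + 0.2 * X[:, 3] + rng.normal(0, 0.5, n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    scaler = StandardScaler().fit(X_train)
    model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    model.fit(scaler.transform(X_train), y_train)
    print("R^2 on held-out data:", model.score(scaler.transform(X_test), y_test))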

  • by Shobana. G
    452,-

    Rapid and unchecked cell proliferation is the defining characteristic of the condition known as cancer. Cancer respects no boundaries and can spread to nearby parts of the body; malignant and benign neoplasms are the two broad categories in its classification. Any portion of the human body may be affected, and the cancer cells may slowly extend to other nearby organs. Cancer causes many deaths across the world, and among the diseases that affect women it carries a particularly high global fatality rate. Because of the slowly spreading nature of the illness, women from all social strata, urban or rural, are equally affected. The human body is composed of trillions of cells, its basic building units. Every cell has a cycle in which it grows, multiplies, ages and dies, but when this orderly process is disturbed, abnormal cells start to proliferate and produce undesirable lumps of tissue. Such lumps are called tumors. Tumors can be either malignant or benign: cancerous tumors that can be fatal are malignant, while non-cancerous tumors are benign. Metastasis is the process by which cancerous cells travel to other sections of the body to form new tumors. Malignant tumors are often life-threatening, while benign tumors, once removed, usually do not recur. Leukemia is a type of cancer that does not form solid tumors. Cancer is not caused by a single factor; multiple factors may contribute. The primary causes reported by scientists are genetic or hereditary conditions, where patients have a family history; other factors include environmental exposure, high radiation, viruses, pesticides and other toxins. Risk factors differ between childhood and adult cancers, and so do medication and diagnostic procedures. Common known risk factors include active smoking, lack of physical exercise, a high-fat diet and use of tobacco products. Cancer progresses through four stages: in the first the cancer has not significantly grown; in the second it has noticeably grown; in the third there is a chance of the tumor spreading to other areas; and in the fourth it expands to more body organs. The main types of cancer are leukemia, or blood cancer; sarcoma, which damages connective tissue; melanoma, which damages pigment cells; and carcinoma, which damages the organs. When body proteins are irregular in shape and clump together, they form amyloid deposits.

  • by Souradip Chattopadhyay
    425,-

    Thin film flows form the core of a large number of scientific, technological, and engineering applications. Such flows can be observed in nature, for example on the windshields of vehicles in rainy weather, and they are found in various engineering, geophysical, and biophysical applications. Specific examples are nanofluidics, microfluidics, coating flows, intensive processing, tear-film rupture, lava flows, and the dynamics of continental ice sheets. Important industrial applications of thin films include nuclear fusion research (cooling the chamber walls surrounding the plasma); complex coating flows, where a thin film adheres to a moving substrate; distillation units, condensers, and heat exchangers; geophysical settings such as gravity currents, mud, granular and debris flows, snow avalanches, ice sheet models, and lava flows; and biological and biophysical scenarios such as flexible tubes and tear-film flows, among many more. The dynamics of such films are quite complex and display rich behavior, which has attracted many mathematicians, physicists, and engineers to the field. In the past three decades, work in the area has progressed considerably, with stress on revealing the stability and dynamics of films driven by forces such as gravity, capillarity, thermocapillarity, centrifugation, and intermolecular forces. The flow may happen over structured or smooth, impermeable or slippery surfaces. Investigation approaches include modeling and analytical work, numerical simulations, and experiments to explain the instabilities that the film can exhibit. Direct analysis of the model equations of interfacial flows is a very complicated mathematical exercise due to the existence of a free, evolving interface bounding the liquid film. The mathematical complexity emerges from several sources: (a) the Navier-Stokes (or Stokes or Euler) equations need to be solved in changing domains; (b) in certain applications one must solve for temperature or electrostatic or electromagnetic fields in addition to the fluid equations; (c) several nonlinear boundary conditions must be specified at the unknown interface(s); and (d) the solutions may not exist for all times. In fact, in thin film problems one may encounter finite-time singularities accompanied by topological transitions; the breakup of liquid jets is an example. However, in the subsequent chapters we shall see that it is possible to use the different length scales appearing in thin film flows to our advantage. Thin films are characterized by much smaller length scales in the vertical direction than in the stream-wise direction. This gives rise to a small aspect ratio, which makes the problem amenable to small amplitude perturbation expansions.
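
    As a standard illustration of this small-aspect-ratio reduction (a textbook result, not quoted from the book): for a film of thickness h(x,t) flowing down an incline at angle α, lubrication theory collapses the Navier-Stokes equations with a free surface into a single evolution equation,

    $$\frac{\partial h}{\partial t} + \frac{\partial}{\partial x}\!\left[\frac{h^{3}}{3\mu}\left(\rho g \sin\alpha - \rho g \cos\alpha\,\frac{\partial h}{\partial x} + \sigma\,\frac{\partial^{3} h}{\partial x^{3}}\right)\right] = 0,$$

    valid when the aspect ratio ε = H/L ≪ 1; the troublesome moving boundary is replaced by the single unknown h(x,t).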

  • by Raunak Rathi
    439,-

    Text mining has emerged as one of the most important data processing activities of the last few decades. While it makes life much easier for millions of everyday users of digital platforms and applications, it is a domain that also challenges researchers in numerous ways. The challenges are manifold, ranging from the volume of data to be processed to storage issues, language identification, and more. The work in this thesis focuses on one particular aspect of text mining: keyword identification for a document. While this may seem a trivial activity for short passages, it is quite difficult to successfully identify keywords for extremely long text documents, and doing so with an automated system only adds to the challenge. Everyone in today's world understands the importance of data. In business, data is used to analyze market trends and understand customer needs, as well as users' perspectives and choices. Data plays a crucial role in our everyday lives in various ways; most businesses would be bound to fail if they could not comprehend the data available to them, which may range from stock indices to customer feedback, worker sentiments and numerous other insights. Analyzing data also supports advertisement notifications and the suggestion of relevant information to users, and helps to understand a user's likes and dislikes. It can make for a better user experience tailored to individual needs: if a user is more interested in cricket, for instance, we can provide targeted insight about cricket. A customized user experience is more attractive than a bland one that is homogeneous for everyone, since everyone's needs, perspectives and opinions differ. We offer examples of keyword extraction, the challenges involved and the major issues faced by designers of keyword extraction algorithms, and finally discuss some common application areas where keyword extraction is used in real-life scenarios. Attracting users and providing them with better services through relevant data also helps the system understand users' needs. A user consciously or unknowingly provides information for business use or expresses views on various platforms; if a user expresses political opinions, this helps tailor the experience the next time the system is used.
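
    A minimal sketch of one common approach in Python (TF-IDF ranking, not necessarily the thesis's algorithm); the corpus is illustrative.

    # Sketch: pick each document's top terms by TF-IDF weight as "keywords".
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = [
        "Text mining extracts keywords from long documents.",
        "Keyword extraction is harder for very long text documents.",
        "Cricket news can be recommended to users interested in cricket.",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")
    tfidf = vectorizer.fit_transform(docs)
    terms = vectorizer.get_feature_names_out()

    for i, row in enumerate(tfidf.toarray()):
        top = sorted(zip(terms, row), key=lambda t: t[1], reverse=True)[:3]
        print(f"doc {i}:", [term for term, score in top if score > 0])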

  • by Jaypal Singh Rajput
    425,-

    High blood pressure, or hypertension (HPT), is a severe disease, and patients may have no symptoms in the early stages. With low awareness and without proper treatment, it can be very harmful and increases the chances of cardiovascular disease. HPT occurs when the systolic blood pressure (SBP) is greater than 140 mmHg or the diastolic blood pressure (DBP) is greater than 80 mmHg. It is a serious medical condition arising from the force of the blood against the artery walls: pressure builds up in the arteries as the heart pumps oxygen-rich blood to the body. HPT may damage vital organs such as the lungs, brain, heart, and kidneys. Possible causes include low physical activity, lifestyle, smoking, stress, family history, and kidney disease; common symptoms are headaches, panic attacks, dizziness, vision changes, anxiety, and depression. Hence, it is a pivotal issue to develop awareness, medical care, and treatment for HPT. Clinical HPT can be classified into mild, moderate, and severe classes. Risky cardiovascular diseases are on the rise and HPT has become more prevalent. HPT is usually diagnosed through a prolonged period of elevated arterial blood pressure, and in the absence of proper treatment it causes serious complications such as stroke, heart attack, and kidney failure. A recent WHO report states that HPT leads to 9.4 million deaths annually, and many people do not even realize that they are suffering from this silent disease. It is therefore crucial to detect HPT at an early stage to ease the problem and avoid further complications. The traditional way to measure blood pressure is the cuff-based mercury sphygmomanometer. These traditional methods depend on medical specialists for precise measurements and accurate readings; they are also inaccurate and inconvenient for use in a home environment. Moreover, HPT is a complex condition that may produce unpredictable fluctuations in an individual's blood pressure. Analyzing and predicting disease status via sphygmomanometer measurements may not be accurate, since the instrument only provides SBP and DBP values.
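
    A minimal sketch in Python using the thresholds stated above (SBP > 140 mmHg or DBP > 80 mmHg); the mild/moderate/severe cut-offs below are illustrative assumptions, not clinical guidance.

    # Sketch: rule-based HPT screening from a single (SBP, DBP) reading.
    def classify_bp(sbp_mmhg: float, dbp_mmhg: float) -> str:
        if sbp_mmhg <= 140 and dbp_mmhg <= 80:
            return "normal"
        if sbp_mmhg < 160:       # assumed mild band
            return "mild HPT"
        if sbp_mmhg < 180:       # assumed moderate band
            return "moderate HPT"
        return "severe HPT"

    print(classify_bp(128, 76))   # normal
    print(classify_bp(150, 95))   # mild HPT
    print(classify_bp(185, 110))  # severe HPT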

  • by Aashirwad Ajit Mahajan
    392,-

    Sound sleep is an important function of our lives and complements successful aging. Sleep quality is an important parameter of health-related quality of life in older adults and is possibly correlated with continued adaptability in later life. Refreshing sleep may indicate proper functioning and can potentially address a variety of concerns in older adults. During wakefulness, neurotoxic waste can accumulate in the central nervous system; sleep facilitates the removal of these waste products. Restorative sleep is also important for consolidating motor skills. Older adults suffering from sleep disturbances can face deleterious consequences in daytime function, and the disturbances may compromise their quality of life. In a study of over 9,000 older adults, 42% of participants reported difficulty initiating and maintaining sleep, with an annual incidence of 5%. Despite such high prevalence and incidence, insomnia in older adults is frequently under-diagnosed and has been given little attention in spite of being a significant problem. Several epidemiologic studies attribute the causes of sleep disturbances to medical co-morbidities rather than aging; however, age-related changes in sleep for adults over 60 years have been documented through polysomnographic studies. When evaluated by polysomnography (PSG), geriatric individuals record increased awakenings and reduced slow-wave sleep (SWS); these are the most distinctive changes and the most consistently reported in studies of age-related changes in sleep. Compared with younger individuals, wakefulness during sleep occurs more often in geriatric individuals. This holds even for older adults in good health, indicating that sleep disturbances can be associated with aging itself: in many cases, healthy older adults with no complaints of sleep disturbance or other significant medical concerns still show changes in sleep architecture compared with their younger counterparts.

  • by H. R. Vinoda Kumar
    405,-

    Breast cancer (BC) is the most common type of cancer in women (24.5%). BC can arise in different parts of the breast, which is divided into three parts: lobules, ducts, and connective tissue. Lobules are the glands that produce milk, ducts carry milk to the nipples, and the surrounding connective tissue holds everything together. Most breast cancers begin in either the ducts or the lobules. In women, BC is the leading cause of cancer death, followed by colorectal and lung cancer. Incidence rates are rising fast in transitioning countries in South America and Africa as well as in high-income Asian countries (Japan and the Republic of Korea), where rates have historically been low. In India, BC is the most common cancer among women aged 15-45 years and accounts for 25% of all cancers in women. BC develops as a new mass or lump. It may be a painless, hard mass with irregular edges, but the cancerous tissue can also be soft, round, tender and painful. In general, BC is divided into invasive ductal carcinoma (IDC) and ductal carcinoma in situ (DCIS). IDC, also known as infiltrating ductal carcinoma, develops in a milk duct and invades the fibrous or fatty tissue of the breast outside the duct. DCIS is a non-invasive condition in which abnormal cells are found in the lining of a breast duct. BC is a heterogeneous disease involving many risk factors, such as age, family history, sex, estrogen, gene mutations, epigenetic mechanisms, and unhealthy lifestyle. A majority of cancers, including breast cancer, are attributed to inherited genetic factors; the genes are associated with germline and somatic mutations in breast cells that are inherited or acquired during a woman's lifetime. Based on genetic mutations, breast cancer risks are classified into moderate risk, moderate-to-high risk and uncertain risk. The genes most strongly associated with breast cancer, with their prevalence and lifetime risk scores, are charted in the book; mutations in genes such as BRCA1, BRCA2, TP53, PALB2, STK11 and CHEK2 initiate the development of BC. Other initiators, such as hormones, chemicals and radiation, are either causative or supportive of the genetic mutations that drive breast cancer.

  • by D. Christina Sagaya Mary
    439,-

    Innovative testing methods using documentaries, newsletters, posters, pamphlets, and comic strips are non-traditional approaches to testing that leverage various types of media to engage learners and assess their understanding of a given subject. These methods are becoming increasingly popular in both educational and professional contexts, as they offer a fresh and interactive way to measure knowledge and skills. Documentaries are a great tool for testing knowledge retention, as they offer a visual and narrative-based approach to education: learners can watch a documentary and then answer questions based on what they saw and heard. This type of testing can be used in a variety of contexts, from history classes to corporate training programs. Newsletters are another effective tool for testing understanding. They can be tailored to specific subjects and distributed to learners digitally or in print, and their format allows a mix of text, images, and interactive elements, making them an engaging way to assess knowledge and comprehension. Posters and pamphlets are both visually striking and easy to distribute, making them an excellent option for testing knowledge in public settings or within larger groups; these materials can be customized to feature specific subject matter and can include quizzes or other interactive elements. Finally, comic strips can be a fun and interactive way to test understanding, particularly in educational settings: comics can feature characters learning about a particular subject, with questions at the end of each strip to test learners' comprehension. By incorporating different types of media into the testing process, educators and trainers can create a more dynamic and effective learning experience. Traditional methods of teaching and learning English as a second language through the deductive method focus mainly on form; there is a need to shift to communicative methods of teaching, learning and testing to enhance the performance of L2 learners in the use of English as a second language in real life. Chambers et al. stated that innovation is "any idea, practice or material artefact perceived to be new by the relevant unit of adoption". Innovative methods of teaching and learning, being inductive in nature, are learner-centric, and the L2 learner feels at home with the process of learning. The L2 learner is involved in the teaching-learning process through the Communicative Language Teaching method, and the process of learning can be enhanced through alternative methods of testing. The study uses documentaries, newsletters, posters, pamphlets and comic strips as innovations for this purpose. L2 learners can experience a sense of discomfort with the 'paper and pencil' language test. Love Joy discussed in detail the history of English language tests and the need for an alternative testing method, employing 'portfolios' for testing the listening, speaking, reading and writing skills of the L2 learner. In a similar context, this study uses DN2PCs as innovative teaching, learning and testing methods to enhance the acquisition of listening, speaking, reading and writing skills and selected sub-skills.
There is also a transformation in the attitude and nature of the L2 learners towards the acquisition of English as a second language at the tertiary level.

  • by Amritha. S.
    452,-

    Breast cancer (BC) is the most common malignancy worldwide, accounting for 11.7% of all cancers in both sexes, with an annual incidence of 2.3 million. BC is the leading malignancy in 159 of 185 countries and showed an increasing trend in incidence from 1996 to 2018. Moreover, incidence is 88% higher in transitioned countries than in transitioning countries (55.9 versus 29.7 per 100,000). The American Cancer Society estimated 287,850 invasive and 51,400 non-invasive BC cases in the US for 2022. BC incidence has increased not only because of changing risk factor profiles but also because of reduced screening and delayed first treatments due to COVID-19. Besides being the most prevalent, BC remains the leading cause of cancer death among women worldwide, with 684,996 deaths [95% UI, 675,493-694,633]. Cancer Facts & Figures 2022 from the American Cancer Society predicted 43,250 deaths in the US. Although BC incidence rates are highest in developed regions, their 5-year survival rates for localised and regional BC are 89.6% and 75.4%, as opposed to developing countries like Costa Rica, India, the Philippines, and Thailand, which show survival of 76.3% for localised and 47.4% for regional BC. BC in Indian females accounts for 8.2% of all cancers; the age-standardised incidence rate of female BC increased by 39.1% (95% UI 5.1-85.5) from 1990 to 2016. India is a culturally diverse country with varying degrees of development, lifestyle and diet, leading to a heterogeneous distribution of disease burden; as per the global burden initiative report, the age-standardised incidence rate of BC varies 3.2-fold across Indian states. In India, the majority of patients present at locally advanced or metastatic stages at the time of diagnosis. The constant increase and variation in BC occurrence may be due to the lack of early-detection screening strategies, the low outreach of existing diagnostic measures, and the interplay of the various risk factors associated with the disease.

  • by Ramesh A.
    399,-

    In the last few decades, energy usage and demand have increased many-fold owing to population growth and the industrial revolution. According to a recent United Nations report, the current world population is 7.7 billion and may reach 8.6 billion by 2030 and 9.8 billion by 2050, so energy needs will keep rising in the coming years. With the increasing living standards of the population in developing countries, the energy requirement is set to double in the next few decades. At present, nearly 80% of global energy demand is met from fossil fuels, which are non-renewable and leave behind a carbon footprint, causing environmental pollution and climate change. Burning fossil fuels emits carbon-based greenhouse gases, and in the past few years CO2 emissions from fossil fuels have increased drastically. There is an urgent need to develop materials and technologies to combat the dangerous CO2 emissions from industrial sectors such as power generation, cement production, petrochemicals, and the aluminium, steel and plastic manufacturing industries. Hence, given the looming energy shortage and the rising concern over climate change, there is an urgent need for clean energy alternatives. Owing to the serious concern over increasing greenhouse gas emissions from fossil fuel usage, environmentally friendly renewable energies are the prime choice for satisfying our future energy needs. Renewables are inexhaustible and decentralized (they can be produced locally anywhere), and renewable energy is the fastest-growing energy source around the world, expected to account for 40% of the increase in primary energy by 2030. Renewable sources such as solar, tidal and wind provide intermittent supplies of electricity, while options for large-scale storage of electricity and for transportation are limited. With the rise in energy demand, energy conversion and clean energy storage technologies have become essential priorities for the global community.

  • by Gayathri. R
    399,-

    According to WHO data, heart disease is to blame for one-third of all deaths globally each year; cardiovascular disease is estimated to claim the lives of around 17.9 million people annually throughout the world. According to the European Society of Cardiology (ESC), around 26 million people worldwide have been diagnosed with cardiac illness, with an additional 3.6 million diagnosed each year. In the first two years after diagnosis, around half of all patients with heart disease die, and heart disease treatment accounts for about 3% of total health-care spending. Effective prediction of heart disease requires a battery of different tests, and improper forecasting may result from medical staff lacking sufficient expertise. Diagnosing the disease at an early stage is difficult. Surgical treatment of heart disease is hard, all the more so in developing countries that lack the medical professionals, diagnostic equipment and other resources essential for accurate diagnosis and treatment of heart patients. Precisely assessing the risk of cardiac failure would help avoid catastrophic heart attacks and improve patient safety. Machine learning algorithms can be effective at detecting disease, provided they are properly trained on relevant data, and publicly available heart disease datasets make it possible to compare prediction models. Scientists can now build highly accurate prediction models by combining machine learning and artificial intelligence, both of which are on the rise; cardiovascular disease (CVD) mortality has been rising in both adults and children.
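
    A minimal sketch in Python (not the book's model): logistic regression on a heart-disease table. The file name and column names are hypothetical stand-ins for a public dataset such as UCI Heart Disease.

    # Sketch: binary heart-disease prediction from tabular clinical features.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("heart.csv")        # hypothetical path
    X = df.drop(columns=["target"])      # clinical features
    y = df["target"]                     # 1 = heart disease, 0 = healthy

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42)

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))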

  • by Kiruthiga K
    459,-

    Cancer is a deadly disease and remains a leading cause of death in every country of the world. According to a recent World Health Organization (WHO) report, cancer is the first or second leading cause of death before the age of 70 in 112 of 183 countries. The rising number of cancer deaths partly reflects declining mortality from other diseases such as stroke and coronary heart disease. In general, cancer incidence and mortality are increasing alarmingly: in 2020, an estimated 19.3 million new cases and 10 million cancer deaths were reported worldwide. As of 2020, about one-half of all cases and 58.3% of cancer deaths, among men and women combined, occurred in Asia, with Europe accounting for 22.8% and the Americas 20.9%. About 2.26 million new breast cancer cases, accounting for 11.7% of cancers of all sites, were reported in GLOBOCAN 2020, up from GLOBOCAN 2018; female breast cancer now occupies the top position among cancer types in terms of new cases. Cancer was once considered a disease of westernized, industrialized countries; nevertheless, it has emerged as a common disease of developing and low-resource countries too. About 5-10% of breast cancer patients possess a genetic predisposition for cancer, and individuals carrying mutation-susceptible genes have a higher risk of developing breast cancer than the general population. Most breast cancers are observed in the ductal region (80%), with the remaining 20% originating in the lobules of the breast (Barzaman et al., 2020). Breast cancer is categorized into three classes: (i) tumors expressing the estrogen hormone receptor (ER+) or progesterone receptor (PR+); (ii) tumors expressing human epidermal growth factor receptor 2 (HER2+); and (iii) triple-negative breast cancer (TNBC: ER-, PR-, HER2-).

  • by Kalai Vani Y. S
    425,-

    Network security is any action an organization takes to prevent malicious use of, or accidental damage to, the network's private data, its users or their devices. The goal of network security is to keep the network running and safe for all legitimate users. Security incident response is one key aspect of maintaining organizational security, and a critical task during incident response is detecting that an incident has occurred. Detection may occur through reports from end-users and other stakeholders in the organization, through detection analysis, or by using an intrusion detection system. Intrusion Detection (ID) is a challenging endeavor, requiring security practitioners to have a high level of security expertise and knowledge of their systems and organization. The demand for ubiquitous personal communications is driving the development of new networking techniques, and information security has become a very important aspect of data communication as people spend a large amount of time connected to a network. Various techniques are employed to improve the security of the data being transmitted. This chapter presents the background, problem discussion, research challenges, objectives and thesis organization. A denial-of-service attack overwhelms a system's resources so that it cannot respond to service requests. A DDoS attack likewise targets a system's resources, but it is launched from a large number of other host machines infected by malicious software controlled by the attacker. There are different types of DoS and DDoS attacks; the most common are the TCP SYN flood attack, teardrop attack, smurf attack, ping-of-death attack and botnets.
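
    A minimal sketch in Python (illustrative only, not the thesis's detector): flagging a possible TCP SYN flood by counting SYN packets per source IP in a sliding window. The packet stream below is simulated, and the window and threshold values are assumptions; a real detector would read from a live capture.

    # Sketch: sliding-window SYN-rate detector.
    from collections import Counter, deque

    WINDOW_S = 10          # window length in seconds (assumed)
    SYN_THRESHOLD = 100    # SYNs per source per window deemed suspicious (assumed)

    events = deque()       # (timestamp, src_ip) of observed SYN packets
    counts = Counter()

    def observe_syn(src_ip: str, now: float) -> bool:
        """Record one SYN packet; return True if src_ip exceeds the threshold."""
        events.append((now, src_ip))
        counts[src_ip] += 1
        while events and now - events[0][0] > WINDOW_S:  # expire old packets
            _, old_ip = events.popleft()
            counts[old_ip] -= 1
        return counts[src_ip] > SYN_THRESHOLD

    # Simulated burst from one source address
    flagged = False
    for i in range(150):
        flagged = observe_syn("203.0.113.7", i * 0.01)
    print("flood suspected:", flagged)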

  • by Ms. Amrita Singh
    446,-

    Cancer is a complex, aberrant biochemical disease, highly variable in its appearance and development, with symptoms that differ from one patient to another. It arises from multiple alterations in physical, environmental, metabolic, chemical and genetic factors. Cancer formation is a long-drawn-out process, with controlling factors that can either accelerate or delay this life-threatening disease. Induction of cancer involves multistep processes that lead cells to proliferate rapidly and in an uncontrolled manner. Cancer cells originate from the transformation of normal cells, whereby they acquire the capability to proliferate abnormally and finally turn malignant. Many cancer cells gain the ability to detach from their original site, escape the immune system, and proliferate beyond their normal limits. Cancer is a leading cause of morbidity and mortality worldwide, with an estimated 8.2 million deaths and 14 million new cases in 2012, the latter predicted to rise to 22 million, and 32.6 million people living with the disease. Earlier data showed that cancer accounts for 8% of deaths globally and 6% of all deaths in India, where about 500,000 people die of it every year; with population growth, cancer deaths are expected to increase to about 700,000 annually. The most commonly diagnosed cancers are lung (1.35 million), breast (1.15 million) and colorectal (1 million) cancers.

  • by Sandhya N
    419,-

    The brain's cognitive functions arise from its neuro-physiological edifice. On receiving a stimulus from the environment, a response is produced through extensive processing of information. Cognition is a complex system that interrelates the functionalities of language, attention, perception, psychomotor skills, memory, and executive function. A healthy brain works quickly and automatically, but when problems occur the consequences are hazardous: when the cognitive functions of the brain are impaired, various types of brain disorders may arise, with disease-specific symptoms. If the damage is permanent, it cannot be reversed; otherwise, surgery, medicines or physical therapy can be applied to the affected area. Dementia is an acquired clinical syndrome in which the patient experiences deterioration from previous functioning in at least one of the following areas: language, visuospatial skills, executive abilities, and emotion. Dementia affects older people, but it is not a result of the normal aging process. The variants of dementia are Alzheimer's disease, vascular dementia, mixed dementia, dementia with Lewy bodies, and frontotemporal dementia.

  • by Amar Bharatrao Deshmukh
    425,-

    Face recognition is one of the most significant problems in computer vision and pattern recognition, where high-resolution (HR) face images or videos are needed to improve the pictorial information available for human analysis and automatic machine interpretation. Image resolution depends on pixel size and determines the detail present in an image. Super Resolution (SR) has several applications, such as video surveillance, intelligent ID cards and access control. Under controlled conditions, with normal illumination, HR imaging and frontal views, existing face identification algorithms can achieve high recognition rates; recognition becomes challenging when only low-resolution (LR) facial images with large pose variations are available. There is also a need to deal with the 'one sample per class' problem, in which only a single frontal HR image per person is present in the gallery, taken from passports, personal certificates or ID cards. In video surveillance, frontal HR gallery images are used to identify an individual, while the probe images are generally LR and non-frontal because of the large distance between camera and subject. In addition, traditional face identification techniques face various problems, such as pose variation, a single gallery image per person, and resolution differences between images. Hence, in real-world applications, face identification remains a difficult task when only LR facial images with large pose variations are available. Another aspect that emphasizes the use of SR algorithms is that the most widely used imaging sensors, Complementary Metal Oxide Semiconductor (CMOS) and Charge-Coupled Device (CCD) arrays, impose limits on spatial resolution. Because hardware and optics manufacturing cannot deliver the expected order of resolution, SR algorithms are needed to attain the resolution improvement goal.
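
    A minimal sketch in Python (an interpolation baseline, not the book's SR method): bicubic upscaling of an LR face crop with OpenCV. Learned SR methods aim to recover detail that plain interpolation cannot; "face_lr.png" is a hypothetical input file.

    # Sketch: 4x bicubic upscaling as a naive super-resolution baseline.
    import cv2

    lr = cv2.imread("face_lr.png")        # e.g. a 32x32 face crop
    scale = 4
    hr_bicubic = cv2.resize(
        lr, (lr.shape[1] * scale, lr.shape[0] * scale),
        interpolation=cv2.INTER_CUBIC)
    cv2.imwrite("face_bicubic_x4.png", hr_bicubic)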

  • by Kumar Shashvat
    385,-

    Vector-borne diseases have become some of the most widespread diseases worldwide, making them a threat to society. According to a WHO report, vector-borne diseases like malaria, dengue and chikungunya are among the most common epidemic diseases in most cities and states. Many scientists have developed new models for forecasting and predicting case counts of these deadly epidemic diseases using various regression models. Every year there are thousands of deaths from these vector-borne diseases. Vector-borne diseases are human illnesses caused by parasites, viruses and bacteria transmitted by mosquitoes, sand flies, bugs, mites, snails and lice. Every year there are more than 700,000 deaths globally from diseases such as dengue, malaria, yellow fever and Japanese encephalitis, and the major vector-borne diseases account for some 17% of all infectious diseases. The burden of these diseases is highest in tropical and subtropical areas, and they affect the poorest populations most. Since 2014, major outbreaks of dengue, malaria and the Zika virus have afflicted populations across South Asia and overwhelmed health systems in various countries. The distribution of vector-borne diseases is determined by complex demographic, environmental and social factors: global trade, travel, unplanned urbanization and climate change can affect pathogen transmission, lengthening transmission seasons or causing diseases to emerge in countries where they were previously unknown.
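
    A minimal sketch in Python (not a model from the book): forecasting weekly case counts from lagged cases and rainfall with linear regression. The data are synthetic stand-ins for real surveillance records.

    # Sketch: one-step-ahead regression forecast of weekly cases.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    weeks = 120
    rainfall = rng.gamma(2.0, 20.0, weeks)   # mm per week, synthetic
    cases = np.zeros(weeks)
    for t in range(1, weeks):
        # cases respond to last week's cases and rainfall, plus noise
        cases[t] = max(0.0, 0.7 * cases[t - 1] + 0.3 * rainfall[t - 1]
                       + rng.normal(0, 5))

    X = np.column_stack([cases[:-1], rainfall[:-1]])  # lagged predictors
    y = cases[1:]                                     # next week's cases
    split = 100
    model = LinearRegression().fit(X[:split], y[:split])
    print("R^2 on held-out weeks:", model.score(X[split:], y[split:]))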

  • by Vishwas Mishra
    399,-

    A fundamental tenet of electrical engineering tells us that the four primary circuit variables are current, voltage, charge, and flux, so there are six ways to pair these four quantities. So far, only three of these pairings have been completely mastered and regulated, as the resistor (R), the capacitor (C), and the inductor (L). These three passive elements, unlike active components, can only store or dissipate energy, not generate it themselves. The behavior of each of the three components may be characterized by a linear connection between two circuit variables: resistors, capacitors, and inductors reflect the relationships between voltage and current, voltage and charge, and current and flux, respectively. After discovering a physical embodiment of the memristor, HP Labs released a framework for it, and ongoing studies have focused on developing memristor-based frameworks that are as close to physical reality as possible. A number of models are available, each with its own benefits and drawbacks. Memristors have been studied extensively to get a clearer picture of their function and attributes, and to build more realistic frameworks of memristor performance. SPICE models have been used for the majority of memristor frameworks to date; in this research, both Cadence and Verilog Hardware Description Language (HDL) are used to model schematic circuits utilizing the memristor frameworks developed. Although several studies between 1994 and 2008 observed comparable qualities, including hysteresis characteristics, none of them made the connection; the observations were instead described as aberrant behavior or features of the voltage-current (V-I) system. A team was eventually able to construct the memristor in 2008, nearly four decades after Chua's original prediction.
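
    As a worked summary (textbook definitions, not quoted from the book): charge and flux are the time integrals of current and voltage, three of the pairings define the classical elements, and the remaining pairing defines the memristance M:

    $$q(t)=\int_{-\infty}^{t} i\,d\tau, \qquad \varphi(t)=\int_{-\infty}^{t} v\,d\tau,$$

    $$R=\frac{dv}{di}, \qquad C=\frac{dq}{dv}, \qquad L=\frac{d\varphi}{di}, \qquad M(q)=\frac{d\varphi}{dq},$$

    which gives v(t) = M(q(t)) i(t): a resistance that depends on the charge history, hence a 'memory resistor'.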

  • by Animesh Kumar Agrawal
    399,-

    The digital revolution of the past decade and more has made our lives much easier. Computer screens have become sleeker and machines smarter with every passing year, and high computing power coupled with artificial intelligence has made our devices easy to use with practically zero learning curve. However, this enhanced sophistication brings a unique challenge for the civilised community: how to detect and prevent cybercrime. Cyber is touted to be the fifth dimension of war, after land, sea, air and space, and will be a deciding factor in future conflicts. To build a strong defence, it is very important that a security breach be detected at the earliest opportunity and analysed for weak links; towards this, forensic analysis of acquired data is needed to identify the perpetrator and prevent future attacks. Forensics is a branch of science that deals with evidence that can be presented in a court of law. Its sub-domain dealing with acquiring and analysing data from computers, smartphones and other digital devices is known as digital forensics: the art of acquiring and analysing information obtained from a computer or digital device. The term "smartphone forensics", synonymous with "mobile forensics", naturally suggests an Android smartphone, primarily because of Android's popularity, although the analysis of basic phones, smart phones and feature phones is all included in mobile forensics. Android, Windows and iOS are installed on 90-95% of the mobiles used across the world. Mobile forensics is expanding its horizon, with Personal Digital Assistants (PDAs) and tablets already in its domain; sooner or later it will include Internet of Things (IoT) devices and smart TVs, primarily because they contain similar OSs that can store a lot of data. In the nascent days of mobile forensics, the prime focus was on getting information from the phone by whatever means: call history, phone book contacts, SMS and MMS, in addition to data on the SIM card. With the increased usage of mobile phones, however, the focus has shifted to extracting data related to social media platforms like Facebook, WhatsApp and Twitter, which are large repositories of personal information. With technological advancement, privacy has become a prime concern on mobiles.

  • by Feroz Shaik
    385,-

    A composite material is defined as a macroscopic combination of two or more constituents, known as phases, producing a material with aggregate properties distinct from those of its constituents. One phase is the 'reinforcement', which provides strength and stiffness; the weaker phase is the 'matrix', which transfers applied loads to the fibres while safeguarding them from the external environment and improving durability. The 'interphase' is an additional phase that exists between the two. Such materials can be engineered to have the best combination of properties for a given application, and for these reasons composites play a significant role in a number of industries, including aerospace, automobiles and sports. Fibres, flakes and particles can all be used as composite reinforcements; fibres, the most prevalent, have a substantial influence on composite characteristics. Because the high ratio of fibre length to diameter provides a large interface area that permits effective shear stress transfer between matrix and fibres, composite components can be processed and manufactured in a variety of shapes using various processes. Polymer matrix composites have been reinforced with a variety of fibres: for years, glass fibre, carbon fibre, aramid fibre and boron fibre have been utilised as reinforcement.
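
    As a standard illustration (the rule of mixtures, not specific to this book): the longitudinal stiffness of a unidirectional fibre composite can be estimated from the constituent moduli and volume fractions,

    $$E_c = V_f E_f + V_m E_m, \qquad V_f + V_m = 1,$$

    so, for example, 60% glass fibre (E_f ≈ 70 GPa) in an epoxy matrix (E_m ≈ 3 GPa) gives E_c ≈ 0.6 × 70 + 0.4 × 3 ≈ 43 GPa.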

  • by Anshul Pareek
    446,-

    There are countless objects in our surrounding environment, each leaving its impression. Vision can be explained as a way of understanding the environment that surrounds us, yet even after decades the exact working of the visual system remains a mystery to the scientists investigating it. When eye-based vision in living creatures is replaced by computational instruments, we speak of computer vision: artificial mimicry of vision in living creatures, in which digital images and videos captured by cameras are analyzed by computers to obtain an optimum level of understanding. Human or object tracking over an array of frames is the operation of following a moving target object over a span of time with the help of a moving or stationary camera. It has been a critical issue in the arena of computer vision, with applications in security, surveillance, human-computer interaction, augmented reality, video communication and compression, medical imaging, traffic control, video editing, and assistive robotics. It is a highly studied problem and remains complex to solve. In object tracking in a given video, the major task is to trace the target object in upcoming frames. Object tracking is a principal component of human-computer interaction in a real-time environment, where the computer builds a finer model of the real-time world; with autonomous vehicles, for example, a human being cannot convey the exact state of the surroundings precisely and quickly enough. The wide-ranging scope of applications underlines the significance of dependable, exact, and efficacious object tracking. The two most important choices for effective tracking are, first, the selection of the model and, second, a tracking method suited to the task. The fundamental requirements of any tracking framework are a robust system, an adaptive system, and real-time processing. The best-known state-of-the-art tracking strategies are interest point-based tracking, multiple hypothesis tracking, kernel-based tracking, and optical flow-based tracking. The area has seen remarkable growth thanks to low-cost, technologically advanced cameras and low computing complexity, matching the rise of ingenious approaches to image and video processing; excellent reviews of the state-of-the-art techniques are available in the literature.
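
    A minimal sketch in Python (illustrative, not the book's tracker): single-object tracking with OpenCV's CSRT tracker, which requires the opencv-contrib-python package. The video path and initial bounding box are hypothetical.

    # Sketch: track one target through a video, drawing its box per frame.
    import cv2

    cap = cv2.VideoCapture("video.mp4")
    ok, frame = cap.read()

    tracker = cv2.TrackerCSRT_create()
    tracker.init(frame, (200, 150, 80, 120))   # (x, y, w, h) of the target

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, box = tracker.update(frame)     # re-locate the target
        if found:
            x, y, w, h = map(int, box)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) == 27:               # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()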

  • by Kaustubh Bhattacharyya
    446,-

    The means of sending and receiving information without physical interaction has always been an area of interest, and communication engineers are keen to improve the quality of information exchange by wireless methods. Antennas are the most vital part of a wireless network: the reliability of information exchange through a wireless system depends greatly on the functionality of the antennas at both transmitter and receiver. Thus, the design of a near-to-perfect antenna has always been an inevitable ambition of the communication engineer. The design of microstrip antennas and slot antennas has remained a globally trending area of research for several decades, and many novel approaches have evolved for designing microstrip and slot antennas of different physical and electromagnetic properties at microwave frequencies. As the need for bandwidth increases, researchers are exploring paths to design THz antennas, and the know-how already developed for microstrip antennas is proving handy in this direction. Metamaterials are artificial structures whose properties are not easily found in nature; from this generalized notion, research and development on metamaterials can focus on any part, provided the proposed artificial structure exhibits the unique electromagnetic properties. The capability to recognize a targeted object accurately by its radar return alone, under all environmental conditions, is of great concern to automatic target recognition (ATR) systems. ATR generally refers to the use of computer processing to detect and recognize the target signature in sensor data. ATR has become increasingly important in modern defence strategy because it permits precision strikes against specific tactical targets with minimized risk and maximized efficiency while decreasing collateral damage to other objects. A key advantage of tools such as the Artificial Neural Network (ANN) is that, since the algorithms automatically learn from the available data, the resulting high-performance algorithms are customized to the variable data.

  • by P. Loganathan.
    345,-

    Human life not only depends on nature but also imagines, compares and creates things as they exist in nature. What man has found and finds in nature is so rhapsodic that one reads a wealth of metaphors and similes in literature. Although human beings are blessed with intelligence and a sixth sense, whatever man sees in nature is more beautiful and powerful than what he invents or creates. Thus nature serves as a source of human imagination and invention, and man cannot refrain from glorifying it in life and literature; ancient literature described wildlife as well. But the development of science and technology has gradually shrunk the space of nature. Meadows and pastures have disappeared; forests have receded; rivers have dried out; waterfalls have thinned to nothing; ponds with lotus flowers and fish have been swallowed up. Concrete structures have conquered all these marvellous creations of nature. Hazardous chemicals have been mixed into water, water resources and soil, and have risen to pollute the air and atmosphere in the form of fumes, smoke and gases. Now that this has become a threat to human life, man has started talking about environmental pollution, and it has left its mark on literature too: writers have started to write about nature-oriented and environment-related impacts of human action. Present-day literature branches out into different genres. Writers focus on ecological themes dealing with the harm done to the environment and the polluting of the elements of nature; these themes address the environmental damage caused by human beings and its impact, as a reverse action, on human life and all other living beings. Writers have also taken a step forward to observe the changing climatic conditions that erode the possibility of living in already civilised places and regions. This has opened a new avenue of literature and criticism that alludes to climate fiction, climate criticism and global warming. Before discussing the subject of climate and climate fiction, it is necessary to study the history of climate that geologists and scientists have researched.

  • av Jeevan Chakravarthy A. S.
    365,-

    Organosilanes have gained immense recognition worldwide due to their plethora of roles in synthetic organic chemistry, particularly in carbon-carbon bond-formation reactions. The presence of silicon was first predicted by Lavoisier in 1787. Early attempts to extract pure silicon in elemental form remained unsuccessful. Davy proposed the name "silicium" in 1808, assuming the element to be a metal. An impure amorphous form of silicon was first isolated by Gay-Lussac and Thenard in 1811, by heating potassium metal with silicon tetrafluoride. The element's present name was given by Thomson in 1817, who retained part of Davy's nomenclature but added "-on" as a suffix, in the belief that silicon was a non-metal similar to boron and carbon. Silicon in its crystalline form was first isolated by Deville in 1854, by electrolysis of a mixture of aluminum chloride and sodium chloride containing about 10% silicon. Today, pure silicon is obtained on an industrial scale by the reduction of silica with coke in an electric arc furnace. On smaller scales, silicon is obtained by the reduction of silicon tetraiodide, silicon tetrachloride or chlorosilane. Isolation of pure silicon on an industrial scale is essential owing to the element's importance, and pure silicon is further refined by zone-melting purification. Applications of pure silicon include electronic components such as semiconductor devices, transistors, printed circuit boards and integrated circuits. Silicon is also used in solar panels and high-power lasers.
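
    For reference, the carbothermic reduction of silica in the electric arc furnace mentioned above can be summarized by the standard textbook equation (not drawn from the book itself):

    \[
    \mathrm{SiO_2} + 2\,\mathrm{C} \;\xrightarrow{\text{electric arc}}\; \mathrm{Si} + 2\,\mathrm{CO}
    \]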

  • av K. Munivara Prasad
    338,-

    The Internet revolution has completely changed the traditional ways in which essential applications function, including banking, healthcare, defense, academia and logistics. Internet-based services replaced these traditional services at a rapid pace over the past two decades. The growing internet-dependency of individuals and organizations has made the internet the fundamental support of the information world. The emergence of new Internet-based services such as e-governance and e-procurement contributes significantly to global social and economic development. With the exponential growth in Internet-based services and users worldwide, internet infrastructure and services face numerous challenges related to continuous attacks. Distributed Denial of Service (DDoS) is one of the most frequently observed attacks on internet architecture, posing serious challenges to the defense mechanisms incorporated in the framework. As DDoS attack information is not made public by companies (to avoid deterioration of brand value), researchers often face the problem of limited information when designing effective defensive strategies against DDoS attacks. The present work examines the impact of DDoS attacks and the inherent vulnerability of the internet architecture. Real DDoS attack events, together with their financial impact on companies, are included, and the need for an efficient DDoS defense strategy is established. Internet resources and network systems should be readily accessible to genuine users who wish to use the services at any given time. The unavailability of internet services and applications at the required instant is one of the major challenges restricting the spread of Internet-based services. Unavailability can result from either intentional or accidental causes. The basic internet framework is designed to handle accidental failures but is not effective against intentional acts such as intrusion, malware and hacking. The Denial-of-Service (DoS) attack is categorized among the intentional failures observed on the internet, caused by malware programmers or intruders. These attackers deny or compromise the availability of internet resources to genuine, authorized users.
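
    As a toy illustration of the availability-protection goal described above (not the defense scheme developed in the book), the following sketch flags source addresses whose request rate in a sliding window exceeds a threshold; the window length, threshold and IP address are all hypothetical.

    ```python
    # Toy sketch of one elementary DDoS-detection idea: flag source IPs whose
    # request rate within a sliding time window exceeds a fixed threshold.
    # Real defenses are far more sophisticated; this only illustrates the idea.
    import time
    from collections import defaultdict, deque
    from typing import Optional

    WINDOW_SECONDS = 10
    MAX_REQUESTS_PER_WINDOW = 100      # hypothetical threshold

    _recent = defaultdict(deque)        # source IP -> timestamps of recent requests

    def is_suspicious(src_ip: str, now: Optional[float] = None) -> bool:
        """Record one request from src_ip; return True if its rate looks abusive."""
        now = time.monotonic() if now is None else now
        q = _recent[src_ip]
        q.append(now)
        while q and now - q[0] > WINDOW_SECONDS:   # evict events outside the window
            q.popleft()
        return len(q) > MAX_REQUESTS_PER_WINDOW

    # Example: 150 requests arriving within one second from a single source
    for i in range(150):
        flagged = is_suspicious("203.0.113.7", now=i / 150.0)
    print("flagged as suspicious:", flagged)       # True
    ```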

  • av Praveen Kumar
    452,-

    Brain-Computer Interface (BCI) decoding performance refers to the accuracy and reliability with which a BCI system can interpret and decode brain signals obtained through EEG (electroencephalography) recordings, focusing specifically on error-related potentials (ErrPs). BCIs are innovative systems that allow direct communication and control between the human brain and external devices such as computers or prosthetic limbs. EEG is a non-invasive technique that measures the electrical activity of the brain using electrodes placed on the scalp; it captures neural signals that represent various cognitive and motor processes. Error-related potentials are specific brainwave patterns generated in response to the occurrence or anticipation of errors during cognitive tasks or feedback. They can be detected and extracted from EEG recordings and used as informative markers in BCI systems, providing valuable insights into the user's cognitive processes, including error detection, error monitoring and response correction. The decoding performance of a BCI system using EEG ErrPs is a crucial aspect of its effectiveness and usability. It involves the development and application of signal-processing techniques, machine-learning algorithms and classification methods to accurately interpret the neural activity captured by EEG electrodes. Achieving high decoding performance involves several steps: signal preprocessing, artifact rejection, feature extraction and selection, and spatial and temporal filtering. Machine-learning algorithms are then employed to train the system to classify and interpret the extracted features; the choice of classification algorithm plays a significant role in determining the system's accuracy and real-time performance. Improving BCI decoding performance using EEG ErrPs has wide-ranging applications. It can facilitate advances in neurofeedback, allowing individuals to enhance their cognitive control and self-regulation. BCIs using ErrPs have potential therapeutic applications in neurorehabilitation for individuals with neurological disorders or motor impairments, and they can enable assistive technology for individuals with limited motor control, providing alternative means of communication and control. Overall, the study and enhancement of BCI decoding performance using EEG ErrPs contribute to the development of more robust and reliable brain-machine interfaces, paving the way for improved human-computer interaction and the integration of neural interfaces into many fields of research and everyday life. BCI technology is poised to have a profound impact on people with severe communication and control disabilities due to locked-in syndrome (LIS). Patients suffering from LIS are paralyzed, with no voluntary control over their motor movement and speech, despite intact cognition. LIS can result from a variety of clinical conditions, including stroke, spinal cord injury (SCI) and motor neuron disease, most notably amyotrophic lateral sclerosis (ALS). BCI systems bypass the natural communication pathway between the brain and the peripheral nervous system and provide an alternative pathway for communication and control: the BCI system decodes neuronal signatures from the acquired brain signals and translates the decoded output to control an external device.
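
    A minimal sketch of the pipeline the blurb outlines (band-pass filtering, epoching, classification), assuming synthetic placeholder data; the sampling rate, event times and labels are invented for illustration, and shrinkage LDA stands in for whatever classifier the book actually develops.

    ```python
    # Minimal ErrP decoding sketch: temporal filtering, epoching, and
    # classification with shrinkage LDA (a common, sample-efficient BCI choice).
    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    fs = 250                                        # sampling rate (Hz), assumed
    eeg = np.random.randn(8, 60 * fs)               # 8 channels, 60 s of fake EEG
    events = np.arange(fs, 55 * fs, fs)             # one hypothetical event per second
    labels = np.random.randint(0, 2, len(events))   # 1 = error trial, 0 = correct

    # 1. Temporal filtering: ErrP energy lies mostly below ~10 Hz.
    b, a = butter(4, [1, 10], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, eeg, axis=1)

    # 2. Epoching: cut 0-600 ms post-event windows, flatten channels x time.
    win = int(0.6 * fs)
    X = np.stack([filtered[:, s:s + win].ravel() for s in events])

    # 3. Classification: cross-validated accuracy of shrinkage LDA.
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
    ```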

  • av Nafees Akhter Farooqui
    425,-

    The world's greatest agricultural need is high production; hence most countries use modern techniques to boost crop yields. Advanced technology should increase yields, yet environmental stresses (pests, diseases, drought stress, nutritional deficits and weeds) can affect plants at any stage, so in agriculture both quantity and quality are reduced. Crop diseases are the most important cause of quality and quantity losses in farm production. Such losses negatively affect the profit and production costs of stakeholders in farming. Conventionally, plant pathologists and farmers rely on their eyes to notice diseases and make decisions based on their knowledge; these judgments are often imprecise and at times biased, since many diseases look similar in their early stages. This practice has paved the way for the needless use of pesticides, resulting in high production costs. A precise disease detector built on a consistent dataset is therefore essential to assist farmers, particularly young and inexperienced ones. Advances in computer vision help here through the use of machine learning (ML) or deep learning (DL) schemes. There is also a need for early disease-recognition systems to protect the yield over time. Accordingly, convolutional neural networks (CNNs) are widely deployed in crop-disease detection, with reasonable results. Nevertheless, crop-disease images obtained in the field are characteristically uncertain, which significantly limits improvements in recognition accuracy. The prevalence of crop diseases has a detrimental effect on agricultural output and increases food insecurity. The agricultural industry relies heavily on the early identification of diseases for their prevention. Spots or scars on the leaves, stems, flowers or fruits are common symptoms of crop diseases, and most anomalies can be diagnosed by looking for telltale signs specific to a given disease or pest. The leaves of crops are often the first to show signs of disease, making them an excellent starting point for diagnosis.
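
    A minimal sketch of the kind of CNN leaf-disease classifier the blurb describes, not the book's own model; the input size, class count and dummy batch are placeholders, and a real model would be trained on labeled leaf images (e.g., a dataset such as PlantVillage).

    ```python
    # Minimal CNN sketch for leaf-disease image classification (PyTorch).
    # Two conv/pool stages extract visual features (spots, scars, texture);
    # a small fully connected head maps them to disease classes.
    import torch
    import torch.nn as nn

    class LeafCNN(nn.Module):
        def __init__(self, n_classes: int = 4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64), nn.ReLU(),   # 64x64 input -> 16x16 maps
                nn.Linear(64, n_classes),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x))

    model = LeafCNN()
    dummy_batch = torch.randn(8, 3, 64, 64)   # 8 fake 64x64 RGB leaf images
    logits = model(dummy_batch)
    print(logits.shape)                        # torch.Size([8, 4])
    ```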
