du.se Publications
101 - 150 of 363
  • 101.
    Fleyeh, Hasan
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Shi, Min
    Wu, Haifeng
    A Robust Model for Traffic Signs Recognition Based on Support Vector Machines (2008). In: CISP'2008, Hainan, China, 2008. Conference paper (Refereed)
  • 102.
    Fleyeh, Hasan
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Shi, Min
    Wu, Haifeng
    Support vector machines for traffic signs recognition (2008). In: IEEE International Joint Conference on Neural Networks, 2008 (IJCNN 2008 / IEEE World Congress on Computational Intelligence), Hong Kong, 2008, p. 3820-3827. Conference paper (Refereed)
    Abstract [en]

    In many traffic sign recognition systems, one of the main tasks is to classify the shapes of traffic signs. In this paper, we have developed a shape-based classification model using support vector machines. We focused on recognizing seven categories of traffic sign shapes and five categories of speed limit signs. Two kinds of features, binary images and Zernike moments, were used to represent the data to the SVM for training and testing. We compared and analyzed the performance of the SVM recognition model using different feature representations and different kernels and SVM types. Our experimental data sets consisted of 350 traffic sign shapes and 250 speed limit signs. The experiments showed excellent results: 100% accuracy on sign shape classification and 99% accuracy on speed limit sign classification. The performance of an SVM model depends highly on the choice of model parameters. Two search algorithms, grid search and simulated annealing, were implemented to improve the performance of our classification model. The SVM model was also shown to be more effective than a Fuzzy ARTMAP model.
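The grid search over SVM parameters mentioned in the abstract can be sketched in a few lines. This is a generic sketch, not the paper's implementation: the `score` callable and the toy scoring surface below are illustrative stand-ins for cross-validated SVM accuracy over a (C, gamma) grid.

```python
import itertools

def grid_search(score, c_values, gamma_values):
    """Exhaustively evaluate every (C, gamma) pair and keep the best.

    `score` is assumed to return a cross-validated accuracy for one
    parameter pair; here it is a caller-supplied stand-in.
    """
    best_params, best_score = None, float("-inf")
    for c, gamma in itertools.product(c_values, gamma_values):
        s = score(c, gamma)
        if s > best_score:
            best_params, best_score = (c, gamma), s
    return best_params, best_score

# Toy scoring surface peaking at C=10, gamma=0.1 (illustrative only).
toy = lambda c, g: -abs(c - 10) - abs(g - 0.1)
params, _ = grid_search(toy, [1, 10, 100], [0.01, 0.1, 1.0])
print(params)  # (10, 0.1)
```

The simulated annealing search the abstract also mentions would replace the exhaustive loop with a randomized walk over the same parameter space.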

  • 103.
    Fleyeh, Hasan
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Westin, Jerker
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Extracting Body Landmarks from Videos for Parkinson Gait Analysis (2019). Conference paper (Refereed)
    Abstract [en]

    Patients with Parkinson's disease (PD) exhibit a gait disorder called festinating gait, which is caused by a deficiency of dopamine in the basal ganglia. To analyze the gait of patients with PD, different spatiotemporal parameters such as stride length, cadence, and walking speed should be calculated. This paper presents a method to extract useful information, represented by the positions of certain landmarks on the human body, that can be used for the analysis of PD patients' gait. The method is tested on 132 videos collected from 7 PD patients and 7 healthy controls. The positions of 4 body landmarks, namely the body's center of gravity (COG), the position of the head, and the positions of the feet, were computed using a total of more than 41,000 video frames. Plots of the objects' movements show a high level of accuracy in the calculation of the body landmarks.
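The body's center of gravity, one of the landmarks above, reduces to the mean position of foreground pixels in a segmented frame. A minimal sketch under that assumption (the paper's actual video pipeline is not reproduced here):

```python
def center_of_gravity(mask):
    """Mean (row, col) of foreground pixels in a binary mask."""
    rows = [r for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    return sum(rows) / len(rows), sum(cols) / len(cols)

# A tiny hypothetical silhouette: a 2-pixel-wide column of foreground.
silhouette = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]
print(center_of_gravity(silhouette))  # (1.0, 1.5)
```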

  • 104.
    Fleyeh, Hasan
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Hansson, Karl
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Feature selection and bleach time modelling of paper pulp using tree based learners (2016). In: Web Information Systems Engineering – WISE 2016: 17th International Conference, Shanghai, China, November 8-10, 2016, Proceedings, Part I / [ed] Wojciech Cellary, Mohamed F. Mokbel, Jianmin Wang, Hua Wang, Rui Zhou, Yanchun Zhang. Shanghai, China: Springer, 2016, Vol. 10042, p. 385-396. Conference paper (Refereed)
    Abstract [en]

    Paper manufacturing is energy demanding, and improved modelling of the pulp bleach process is the main non-invasive means of reducing energy costs. In this paper, the time it takes to bleach paper pulp to the desired brightness is examined. The model currently used is analysed and benchmarked against two machine learning models (Random Forest and TreeBoost). Results suggest that the current model can be superseded by the machine learning models and that it does not use the optimal compact subset of features. Despite the differences between the machine learning models, a feature ranking correlation has been observed for the new models. One novel, yet unused, feature that both machine learning models found to be important is the concentration of bleach agent.
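The feature ranking idea can be illustrated with permutation importance: score how much a model's error grows when one feature column is shuffled. Note this is a generic stand-in, not the paper's method — the paper uses the built-in rankings of Random Forest and TreeBoost; the linear toy model below is an illustrative assumption.

```python
import random

def permutation_importance(predict, X, y, n_features, seed=0):
    """Rank features by the error increase when one column is shuffled."""
    rng = random.Random(seed)
    def mse(Xm):
        return sum((predict(row) - t) ** 2 for row, t in zip(Xm, y)) / len(y)
    base = mse(X)
    scores = []
    for j in range(n_features):
        col = [row[j] for row in X]
        rng.shuffle(col)  # break the link between feature j and the target
        Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
        scores.append(mse(Xp) - base)
    return scores

# Toy data where only feature 0 matters: y = 3 * x0.
X = [[float(i), float(i % 2)] for i in range(20)]
y = [3.0 * row[0] for row in X]
scores = permutation_importance(lambda row: 3.0 * row[0], X, y, 2)
print(scores[0] > scores[1])  # True: shuffling the used feature hurts
```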

  • 105.
    Fleyeh, Hasan
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Zhao, Ping
    A contour-based separation of vertically attached traffic signs (2008). In: 34th Annual Conference of the IEEE Industrial Electronics Society, vols 1-5, proceedings, 2008, Vol. 1-5, p. 1747-1752. Conference paper (Refereed)
    Abstract [en]

    This paper presents a contour-based approach to separating vertically attached traffic signs. The algorithm uses binary images, generated by any color segmentation algorithm, to represent objects which could be candidate traffic signs. Since all traffic signs are symmetric about their vertical axis, an improved cross-correlation algorithm is invoked to determine this symmetry and filter the traffic sign candidates. Shape decomposition is used to smooth the contour of the candidate object iteratively in order to reduce white noise. A flipping point detection algorithm, which locates black noise along the smoothed contour, and a curve prediction algorithm are invoked to determine the final cut points. A separation accuracy of 94% is achieved by the algorithm. In this experiment, more than 70,000 images of different traffic sign combinations were used to achieve this result. The algorithm is tested on one-sign, two-sign, and three-sign images which are combined together for the purpose of testing.
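The vertical-axis symmetry test can be approximated by correlating a binary mask with its left-right mirror. This normalized-overlap version is a simplified stand-in for the paper's improved cross-correlation algorithm, shown only to make the idea concrete:

```python
def vertical_symmetry(mask):
    """Overlap between a binary mask and its left-right mirror.

    Returns a score in [0, 1]; 1 means perfectly symmetric about
    the vertical axis.
    """
    mirrored = [row[::-1] for row in mask]
    overlap = sum(a & b for r1, r2 in zip(mask, mirrored)
                  for a, b in zip(r1, r2))
    total = sum(v for row in mask for v in row)
    return overlap / total if total else 0.0

symmetric = [[0, 1, 1, 0], [1, 1, 1, 1]]  # mirror-symmetric shape
skewed = [[1, 1, 0, 0], [1, 1, 0, 0]]     # all mass on the left
print(vertical_symmetry(symmetric), vertical_symmetry(skewed))  # 1.0 0.0
```

A candidate whose score falls below some threshold would be filtered out as a non-sign.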

  • 106.
    Forsman, Anders
    et al.
    Dalarna University, School of Technology and Business Studies, Information Systems.
    Larsson, Hed Kerstin
    Dalarna University, School of Education, Health and Social Studies, Natural Science.
    Memedi, Mevludin
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Rosendahl, Hans
    Dalarna University, School of Technology and Business Studies, Information Systems.
    Hur kan man flippa klassrum – några exempel på "blended learning" från olika kurser på campus och distans (2015). Conference paper (Other academic)
    Abstract [sv]

    The aim was to describe and compare how we work with blended learning/flipped classroom (FC) in four different courses (campus and/or distance) at Dalarna University, with the goal of examining our own practice, sharing experiences and inspiring each other. We inventoried and discussed how we work with blended learning/FC and why. Three of the courses, a research methodology course in informatics (campus), an informatics course on web pages (campus) and a computer engineering course in programming (distance), are in principle given as "classic" FC, i.e. with a recorded lecture followed by discussion at a subsequent seminar or equivalent (with some variation). In the fourth course, natural science in teacher education (campus/distance), recorded and streamed lectures are mixed, and labs and seminars are carried out both on campus and at a distance (this applies to both the campus and the distance courses). We work in similar but also different ways depending on "subject culture" and the differing character of the subjects and courses, but we have roughly the same goal: to get better prepared and more active students, i.e. to try to promote deep learning. The scope for more far-reaching changes in working methods, however, also depends on how "open" the subject culture is to them. A common conclusion is that it is important to consider how technology can be used in the service of pedagogy to enable and stage FC. We all experience FC as an opportunity to do something in a new and hopefully better way. We also agree on the importance of activating the students and getting them to collaborate; the crux is how to accomplish this. New working methods should not mean that the students sit and watch while we work, even if we do so differently than in e.g. traditional lectures, because then we are back in the "one-way communication" that FC is meant to move away from.
    We have also realized that different subjects and courses have different conditions and challenges, which has led us to use partly different strategies and methods. One difference from the descriptions of FC in the literature is that we all also use it in our distance courses, which adds a further dimension of challenges, both pedagogical and technical, compared with "flipping" on campus. It has been inspiring and rewarding to discuss how we reason about and carry out our different versions of FC, and the conversations have also allowed us to reflect on our own practice and mirror it in the others' practices, which has both inspired us and yielded practical tips.

  • 107.
    Fridolfsson, Thomas
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Forensisk undersökning av Amazon Kindle (2015). Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This work has been done in cooperation with the Swedish Armed Forces (Försvarsmakten) and presents the possibilities for forensic investigations of the e-book reader Amazon Kindle. The literature study describes how previous research in the field is very limited. The work therefore aims to answer what data of forensic interest a Kindle can contain and how it can be extracted, where this information is stored and whether this differs between models and firmware versions, as well as whether it is enough to investigate only the part of the memory that is available to the user or whether further privileges to reach the whole memory area need to be obtained.

    To do this, three different models of Kindle are filled with information. After that, data images are taken of them, first of only the user partition and then of the whole memory area after a privilege escalation has been performed. The gathered data is analyzed and the results are presented.

    The results show that information of forensic interest such as notes, visited web sites and documents can be found, and there is therefore value in performing forensic investigations of Amazon Kindles. There are differences between what information can be found and where it is stored on the different units. The units have four partitions, of which only one is accessible without privilege escalation. Because of this, there is an advantage in obtaining images of the whole memory area.

    In addition to the above, a method is presented for bypassing the device code of a unit, thereby getting complete access to it even though it is locked.

  • 108.
    Ghiamati Yazdi, Samira
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Non-stationary Feature Extraction Techniques for Automatic Classification of Impact Acoustic Signals (2008). Student thesis
    Abstract [en]

    In this work, a pattern recognition technique has been proposed to automate the process of investigating the condition of wooden railway sleepers. Condition monitoring of wooden railway sleepers is mainly performed by visual inspection, and some impact acoustic tests are also done manually. Though the manual procedure uses non-destructive testing methods (visual and sound analysis), decision making is largely based on intuition; moreover, the process is slow, expensive and requires skilled and trained staff. Impact acoustic signals have been collected from wooden railway sleepers for the purpose of achieving automation. Given the non-stationary nature of such impact acoustic signals, the emphasis in this work has been on non-stationary feature extraction techniques such as the Short-Time Fourier Transform and the Discrete Wavelet Transform. With the help of these techniques, the signals can be analyzed in both the time and frequency domains. Different combinations of these techniques have been tested against classifiers such as the Multilayer Perceptron, the Support Vector Machine and the Radial Basis Function network. Data fusion was investigated on mainly two levels, namely the feature level and the classifier level, with the aim of getting more reliable and robust results. Experimental results demonstrate that a classification accuracy of around 84% could be achieved by fusing data at the classifier level.
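Classifier-level fusion, which gave the best accuracy above, can be as simple as a majority vote over the individual classifiers' predictions. A minimal sketch; the label strings and the three-classifier setup are illustrative assumptions, not the thesis's actual MLP/SVM/RBF models:

```python
from collections import Counter

def fuse_by_vote(predictions):
    """Classifier-level fusion: majority vote over per-classifier labels.

    `predictions` holds one label per classifier for a single sample;
    ties resolve to the label encountered first.
    """
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical classifiers disagree; the majority label wins.
print(fuse_by_vote(["good", "bad", "good"]))  # good
```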

  • 109.
    Gilani, Syed Hassan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Road Sign Recognition based on Invariant Features using Support Vector Machine (2007). Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    For the last two decades, researchers have been working on developing systems that can assist drivers in the best way possible and make driving safe. Computer vision has played a crucial part in the design of these systems. With the introduction of vision techniques, various autonomous and robust real-time traffic automation systems have been designed, such as traffic monitoring, traffic-related parameter estimation and intelligent vehicles. Among these, automatic detection and recognition of road signs has become an interesting research topic. Such a system can inform drivers about signs they do not recognize before passing them. The aim of this research project is to present an intelligent road sign recognition system based on a state-of-the-art technique, the Support Vector Machine. The project is an extension of the work done at the ITS research platform at Dalarna University [25]. The focus of this research work is on the recognition of road signs. When classifying an image, its location, size and orientation in the image plane are irrelevant features, and one way to remove this ambiguity is to extract features which are invariant under the above-mentioned transformations. These invariant features are then used in a Support Vector Machine for classification. The Support Vector Machine is a supervised learning machine that solves problems in higher dimensions with the help of kernel functions and is best known for classification problems.

  • 110.
    Gunnarsson, Martin
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Utformning av optimal lösenordskomplexitet för tangenttryckningsautentisering (2018). Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This work is on the subject of "keystroke authentication" and identifies what type of password is best suited for the authentication method to get as consistent and accurate authentication results as possible.

    A variety of password recommendations have been compared. Based on these recommendations, password categories of different complexities were created that vary in both content and number of characters. Then, passwords were created that met the complexity requirements of the categories.

    In order to compute the categories' temporal properties and compare which category is most consistent and accurate when inputted, a web tool was developed. The tool's task was to receive password entries from selected test subjects and to register which keys were pressed and when the keystrokes were made. The test subjects consisted of individuals with varying personal qualities as well as varied computer skills.

    The tool registered the keycode of each pressed key as well as time stamps for each key press and key release. These could then be used for the calculation and comparison of accuracy. The accuracy was the time difference between the entries of the passwords within the same password category; a comparison between the password categories could then be made. A survey was also conducted to get an insight into the test subjects who participated in the keystroke study.

    The results show that passphrases consisting of 12 characters from 4 different character categories and containing at least two Swedish words provide the most consistent and accurate input results. Hence, this type of password is recommended for keystroke authentication to get as consistent and accurate authentication results as possible.

    The results also show that special characters are the character category that affects the temporal accuracy the most, and that the most accurate keystroke function varies depending on the complexity of the password used.

    All results are based on 1476 unique password entries from 41 different test subjects.
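The timing features such a tool records are typically dwell times (how long each key is held) and flight times (the gap between releasing one key and pressing the next). A minimal sketch; the event format and the millisecond values below are illustrative assumptions, not the thesis's actual tool:

```python
def keystroke_features(events):
    """Dwell and flight times from (key, down_ms, up_ms) triples.

    Dwell: up - down for each key.
    Flight: next key's down minus current key's up.
    """
    dwell = [up - down for _, down, up in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

# One hypothetical entry of three keystrokes with millisecond timestamps.
entry = [("p", 0, 80), ("a", 120, 190), ("s", 240, 330)]
print(keystroke_features(entry))  # ([80, 70, 90], [40, 50])
```

Consistency across repeated entries of the same password could then be measured as the spread of these per-key timings.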

  • 111.
    Guruswamy Aarumugam, Bhupathi Rajan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Independent Domain of Symmetric Encryption using Least Significant Bit: Computer Vision, Steganography and Cryptography Techniques (2011). Independent thesis Advanced level (degree of Master (One Year)), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    The rapid development of data transfer through the internet has made it easier to send data accurately and quickly to its destination. There are many transmission media for transferring data to a destination, such as e-mail; at the same time, it may be easy to modify and misuse valuable information through hacking. So, in order to transfer data securely to the destination without any modification, there are approaches such as cryptography and steganography. This paper deals with image steganography as well as with different security issues, and gives a general overview of cryptography, steganography and digital watermarking approaches.

    The problem of copyright violation of multimedia data has increased due to the enormous growth of computer networks, which provide fast and error-free transmission of any unauthorized duplicate and possibly manipulated copy of multimedia information. In order to be effective for copyright protection, digital watermarks must be robust, i.e. difficult to remove from the object in which they are embedded, despite a variety of possible attacks.

    For the message to be sent safely and securely, we use watermarking. We use invisible watermarking to embed the message with the LSB (Least Significant Bit) steganographic technique. The standard LSB technique embeds the message in every pixel, but the contribution of the proposed watermarking is to embed the message only along the image edges. Even if a hacker knows that the system uses the LSB technique, he cannot recover the correct message. To make the system robust and secure, we added a cryptographic algorithm, the Vigenere square, so that the message is transmitted as cipher text, which is an added advantage of the proposed system. The standard Vigenere square algorithm works with either lower case or upper case letters only; the proposed algorithm is the Vigenere square extended with numbers, so the crypto key can be a combination of characters and numbers. With these modifications and updates of the existing algorithm, and the combination of cryptography and steganography, we develop a secure and strong watermarking method.

    The performance of this watermarking scheme has been analyzed by evaluating the robustness of the algorithm with PSNR (Peak Signal to Noise Ratio) and MSE (Mean Square Error) against the quality of the image for a large amount of data. The results of the proposed encryption show a high PSNR value of 89 dB with a small MSE value of 0.0017. The proposed watermarking system thus appears secure and robust for hiding information in any digital system, because it combines the properties of both the steganography and cryptography sciences.
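The "Vigenere square with numbers" idea can be sketched over a 36-symbol alphabet of upper-case letters and digits. This is an illustrative reconstruction of the general idea, not the thesis's exact cipher; symbols outside the alphabet pass through unchanged here:

```python
import string

ALPHABET = string.ascii_uppercase + string.digits  # 36-symbol table

def vigenere(text, key, decrypt=False):
    """Vigenere cipher over an alphanumeric alphabet.

    Each alphabet symbol is shifted by the index of the next key
    symbol; decryption shifts in the opposite direction.
    """
    sign = -1 if decrypt else 1
    out, k = [], 0
    for ch in text.upper():
        if ch in ALPHABET:
            shift = ALPHABET.index(key[k % len(key)].upper())
            out.append(ALPHABET[(ALPHABET.index(ch) + sign * shift) % 36])
            k += 1
        else:
            out.append(ch)  # spaces etc. are left as-is
    return "".join(out)

cipher = vigenere("MEET AT 9", "K3Y")  # key mixes a letter and a digit
print(vigenere(cipher, "K3Y", decrypt=True))  # MEET AT 9
```

The cipher text produced this way would then be what the LSB step embeds along the image edges.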

  • 112.
    Gustavsson, Marie
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Riskfaktorer och medvetenhet om dessa vid lagring av data i molntjänster (2018). Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [sv]

    Cloud services have recently grown in popularity, partly so that companies can focus more on their core business and less on operating IT systems. The purpose of this study is to examine awareness of the risks for companies of storing data in a cloud service, and also to highlight which risk factors companies are more and less aware of.

    To form a picture of this awareness, a quantitative web survey was designed in which companies answered questions related to risk factors when storing data in cloud services. The thesis also contains a review of the risk factors of storing data in a cloud service and a smaller review of the security implementations made by the three cloud storage services "Microsoft Azure", "Google Cloud Storage standard" and "Amazon S3 Standard". The risks and security implementations are discussed in comparison with each other, and in a recurring comparison in the discussion section where the answers from the web survey are analyzed.

    Participants in the web survey were categorized into two groups, one consisting of participants in IT professions or with knowledge of information security (15 people) and the other of participants with professions not related to IT or information security (16 people).

    Some of the conclusions drawn are that there is an awareness of risks in cloud services among these participants. Another conclusion is that participants with professions not related to IT or information security had poorer knowledge of cloud services than the other group, but their answers still showed an awareness that cloud services can be insecure.

    Participants in IT professions or with knowledge of information security had more knowledge of cloud services, which was expected since professional experience contributes to greater technical knowledge. Surprisingly, four of these participants showed lower risk awareness than the majority of the group whose professions are not related to IT or information security.

    The risk factors the participants were more aware of included reduced control over the data, that people are contributing risk factors, and that data, especially sensitive data, becomes more accessible to unauthorized parties. The risk factors less known to the participants are the details of cloud service contracts and that data can be modified or lost.

    The literature and some participants argue that cloud providers have more resources, and thereby better capacity to implement security solutions, than smaller companies. This is also something the study of security implementations has shown. The participants' answers further indicate that they too believe it is circumstances other than the cloud provider itself that make the cloud environment insecure.

  • 113.
    Gutta, Gayatri
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Machine vision for the automatic classification of images acquired from Non-destructive tests (2007). Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    This project is based on Artificial Intelligence (AI) and Digital Image Processing (IP) for automatic condition monitoring of sleepers in the railway track. Rail inspection is a very important task in railway maintenance for traffic safety and for preventing dangerous situations. Monitoring railway track infrastructure is an important aspect in which periodic inspection of the rail rolling plane is required. Up to the present day, inspection of the railroad is carried out manually by trained personnel: a human operator walks along the railway track searching for sleeper anomalies. This way of monitoring is no longer acceptable because of its slowness and subjectivity. Hence, it is desirable to automate such intuitive human skills in order to develop more robust and reliable testing methods. Images of wooden sleepers have been used as data for the project. The aim of this project is to present a vision-based technique for inspecting railway sleepers (wooden planks under the railway track) by automatic interpretation of Non-Destructive Test (NDT) data, using AI techniques to determine the results of the inspection.

  • 114.
    Habib, Mohammad Ahasan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    QoS evaluation of Bandwidth Schedulers in IPTV Networks Offered SRD Fluid Video Traffic (2009). Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Internet protocol TV (IPTV) is predicted to be a key technology winner in the future, and efforts are being made to accelerate its deployment. The centralized IPTV model combines the VHO, encoders, a controller, the access network and the home network. Regardless of whether the network is delivering live TV, VOD, or time-shift TV, all content and network traffic resulting from subscriber requests must traverse the entire network, from the super-headend all the way to each subscriber's Set-Top Box (STB). IPTV services require very stringent QoS guarantees. When IPTV traffic shares network resources with other traffic such as data and voice, how to ensure its QoS and efficiently utilize the network resources is a key and challenging issue. QoS is measured in the network-centric terms of delay jitter, packet losses and bounds on delay. The main focus of this thesis is on optimized bandwidth allocation and smooth data transmission; a traffic model for smoothly delivering video service in an IPTV network is proposed together with its QoS performance evaluation. Following Maglaris et al. [5], the coding bit rate of a single video source is first analyzed. Various statistical quantities are derived from bit rate data collected with a conditional replenishment inter-frame coding scheme. Two correlated Markov process models (one in discrete time and one in continuous time) are shown to fit the experimental data and are used to model the input rates of several independent sources into a statistical multiplexer. A preventive control mechanism, including CAC and traffic policing, is used for traffic control. The QoS of a common bandwidth scheduler (FIFO) has been evaluated using fluid models with a Markovian queuing method, and the results are analyzed both by simulation and analytically, measuring packet loss, overflow and mean waiting time among the network users.
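The Markov-modulated source idea can be sketched as a discrete-time two-state chain that emits a low or high bit rate each step. The transition probabilities and rates below are illustrative assumptions, not values from the thesis or from Maglaris et al.:

```python
import random

def simulate_markov_source(p_stay_low, p_stay_high, rates, steps, seed=1):
    """Discrete-time two-state Markov source for video bit rates.

    Each step the source stays in its current state with the given
    probability or switches, then emits that state's rate.
    """
    rng = random.Random(seed)
    state, out = 0, []
    for _ in range(steps):
        stay = p_stay_low if state == 0 else p_stay_high
        if rng.random() > stay:
            state = 1 - state  # switch between low- and high-rate states
        out.append(rates[state])
    return out

# 1000 steps of a bursty toy source: 1.0 Mbit/s low, 5.0 Mbit/s high.
trace = simulate_markov_source(0.9, 0.8, rates=(1.0, 5.0), steps=1000)
print(min(trace), max(trace))
```

Superposing several independent traces like this approximates the aggregate input to the statistical multiplexer described above.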

  • 115.
    Hadji, Leila
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    A Unified Load Generator for Geographically Distributed Generation of Network Traffic (2006). Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    During the last decade, Internet usage has been growing at an enormous rate, accompanied by the development of network applications (e.g., video conferencing, audio/video streaming, e-learning, e-commerce and real-time applications) carrying several types of information, including data, voice, pictures and streaming media. While end-users demand very high quality of service (QoS) from their service providers, the network undergoes complex traffic which leads to transmission bottlenecks. Considerable effort has been made to study the characteristics and behavior of the Internet. Simulation modeling of computer network congestion is a profitable and effective technique which fulfills the requirements for evaluating the performance and QoS of networks. To simulate a single congested link, the simulation is run with a single load generator, while for a larger simulation with complex traffic, where the nodes are spread across different geographical locations, generating distributed artificial loads is indispensable. One solution is to elaborate a load generation system based on a master/slave architecture.

  • 116.
    Hameed, Farhan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Rate Scheduling for HSDPA in UMTS (2008). Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    The introduction of the new technology High Speed Downlink Packet Access (HSDPA) in Release 5 of the 3GPP specifications raises the question of its performance capabilities. HSDPA is a promising technology which gives theoretical rates up to 14.4 Mbit/s. The main objective of this thesis is to discuss the system-level performance of HSDPA. The exploration mainly focuses on the Packet Scheduler because it is the central entity of the HSDPA design. Due to its function, the Packet Scheduler has a direct impact on HSDPA system performance. Similarly, it also determines the end-user performance, and more specifically the relative performance between the users in a cell. The thesis analyzes several packet scheduling algorithms that can optimize the trade-off between system capacity and end-user performance for the traffic classes targeted in the thesis. The performance evaluation of the algorithms in the HSDPA system is carried out with computer-aided simulations under realistic conditions, so as to predict the algorithms' efficiency as precisely as possible. The simulation of the HSDPA system and the algorithms is coded in the C/C++ language.

  • 117.
    Han, Mengjie
    et al.
    Dalarna University, School of Technology and Business Studies, Statistics.
    Håkansson, Johan
    Dalarna University, School of Humanities and Media Studies, Cultural Studies.
    Rebreyend, Pascal
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    How do different densities in a network affect the optimal location of service centers? (2013). Report (Other academic)
    Abstract [en]

    The p-median problem is often used to locate p service centers by minimizing their distances to a geographically distributed demand (n). The optimal locations are sensitive to the geographical context, such as the road network and demand points, especially when these are asymmetrically distributed in the plane. Most studies focus on evaluating the performance of the p-median model when p and n vary. To our knowledge, how the solutions are affected when the road network is altered is not a well-studied problem, especially in a real-world context. The aim of this study is to analyze how the optimal location solutions vary, using the p-median model, when the density of the road network is altered. The investigation is conducted by means of a case study of a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points), Dalecarlia. To locate 5 to 50 service centers we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal locations we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000, using a simulated annealing algorithm with adaptive tuning of the temperature. The results show that there is limited improvement in the optimal solutions when the number of nodes in the road network increases and p is low. When p is high, the improvements are larger. The results also show that the choice of the best network depends on p: the larger p is, the higher network density is needed.
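The simulated annealing approach to the p-median problem can be sketched on a tiny distance matrix. This is a toy sketch of the general technique, without the adaptive temperature tuning used in the report; the instance below (six points on a line) is an illustrative assumption:

```python
import math
import random

def pmedian_annealing(dist, p, iters=2000, t0=1.0, cooling=0.995, seed=0):
    """Simulated annealing for the p-median problem.

    `dist[i][j]` is the distance from demand point i to candidate j.
    A move swaps one open facility for a closed candidate; worse moves
    are accepted with a temperature-dependent probability.
    """
    rng = random.Random(seed)
    n_cand = len(dist[0])

    def cost(open_set):
        # Each demand point is served by its nearest open facility.
        return sum(min(row[j] for j in open_set) for row in dist)

    current = set(rng.sample(range(n_cand), p))
    best, best_cost, t = set(current), cost(current), t0
    for _ in range(iters):
        out = rng.choice(sorted(current))
        inn = rng.choice([j for j in range(n_cand) if j not in current])
        cand = (current - {out}) | {inn}
        delta = cost(cand) - cost(current)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < best_cost:
                best, best_cost = set(current), cost(current)
        t *= cooling  # geometric cooling schedule
    return best, best_cost

# Toy instance: 6 demand points on a line, candidates at the same spots.
pts = [0.0, 1.0, 2.0, 10.0, 11.0, 12.0]
dist = [[abs(a - b) for b in pts] for a in pts]
centers, total = pmedian_annealing(dist, p=2)
print(total)
```

On this toy instance the optimal cost is 4.0 (one center in the middle of each cluster), which the sketch typically finds within a few hundred iterations.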

  • 118.
    Han, Mengjie
    et al.
    Dalarna University, School of Technology and Business Studies, Statistics.
    Håkansson, Johan
    Dalarna University, School of Technology and Business Studies, Information Systems.
    Rebreyend, Pascal
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    How does data quality in a network affect heuristic solutions?2014Report (Other academic)
    Abstract [en]

    Good data quality with high complexity is often seen as important. Intuition says that the more accurate and complex the data, the better the analytic solutions become, provided the increasing computing time can be handled. However, for most practical computational problems, highly complex data means that computation times become too long or that the heuristics used to solve the problem have difficulty reaching good solutions. This is stressed even further when the size of the combinatorial problem increases. Consequently, simplified data are often needed to deal with complex combinatorial problems. In this study we address the question of how the complexity and accuracy of a network affect the quality of heuristic solutions for different sizes of the combinatorial problem. We evaluate this question by applying the commonly used p-median model, which finds optimal locations in a network for p supply points that serve n demand points. To this end, we vary both the accuracy (the number of nodes) of the network and the size of the combinatorial problem (p).

    The investigation is conducted by means of a case study in Dalecarlia, a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points). To locate 5 to 50 supply points we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal locations we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000 (aggregated from the 1.5 million nodes), using a simulated annealing algorithm with adaptive tuning of the temperature. The results show that there is limited improvement in the optimal solutions when the accuracy of the road network increases and the combinatorial problem is simple (low p). When the combinatorial problem is complex (large p) the improvements from increasing the accuracy of the road network are much larger. The results also show that the choice of the best accuracy of the network depends on the complexity of the combinatorial problem (varying p).
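    The simulated annealing step mentioned in the abstract can be sketched generically. The report's adaptive temperature tuning is not specified here, so this illustration uses plain geometric cooling, and the one-dimensional toy problem is invented for the example.

```python
import math, random

def simulated_annealing(initial, cost, neighbour, t0=1.0, alpha=0.95,
                        steps=2000, seed=0):
    """Generic simulated annealing: always accept improving moves, accept
    worsening moves with probability exp(-delta/T); T decays each step."""
    rng = random.Random(seed)
    current = best = initial
    t = t0
    for _ in range(steps):
        cand = neighbour(current, rng)
        delta = cost(cand) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= alpha  # the study tunes T adaptively; plain geometric cooling here
    return best

# Invented toy problem: minimise (x - 3)^2 over the integers.
cost = lambda x: (x - 3) ** 2
neighbour = lambda x, rng: x + rng.choice([-1, 1])
print(simulated_annealing(initial=50, cost=cost, neighbour=neighbour))
```

    For the p-median case, a state would be a set of p candidate nodes and a neighbour move would swap one chosen node for an unchosen one.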

  • 119.
    Han, Mengjie
    et al.
    Dalarna University, School of Technology and Business Studies, Statistics.
    Håkansson, Johan
    Dalarna University, School of Technology and Business Studies, Human Geography.
    Rebreyend, Pascal
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    How does the use of different road networks effect the optimal location of facilities in rural areas?2012Report (Other academic)
    Abstract [en]

    The p-median problem is often used to locate p service facilities among a geographically distributed population. Important for the performance of such a model is the distance measure, which can vary with the accuracy of the road network. The first aim of this study is to analyze how the optimal location solutions of the p-median model vary when the road network is altered. It is hard to find an exact optimal solution for p-median problems, so two heuristics are applied in this study: simulated annealing and a classic heuristic. The secondary aim is to compare the optimal location solutions obtained with the different algorithms for large p-median problems. The investigation is conducted by means of a case study in Dalecarlia, a rural region with an asymmetrically distributed population.

    The study shows that using more accurate road networks gives better optimal-location solutions, regardless of which algorithm is used and regardless of how many service facilities are optimized for. It is also shown that the simulated annealing algorithm is not only much faster than the classic heuristic used here, but in most cases also gives better location solutions.
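    The abstract does not name the classic heuristic; a textbook classic for the p-median problem is vertex substitution (Teitz–Bart), sketched below on invented toy data as one plausible reading, not necessarily the heuristic the report used.

```python
def p_median_cost(centers, demands, weights, dist):
    """Total weighted distance from each demand point to its nearest center."""
    return sum(w * min(dist(d, c) for c in centers)
               for d, w in zip(demands, weights))

def vertex_substitution(candidates, demands, weights, dist, centers):
    """Teitz-Bart style improvement: keep swapping one chosen center for an
    unchosen candidate as long as some swap lowers the total cost."""
    centers = list(centers)
    improved = True
    while improved:
        improved = False
        base = p_median_cost(centers, demands, weights, dist)
        for i in range(len(centers)):
            for cand in candidates:
                if cand in centers:
                    continue
                trial = centers[:i] + [cand] + centers[i + 1:]
                if p_median_cost(trial, demands, weights, dist) < base:
                    centers, improved = trial, True
                    break
            if improved:
                break
    return sorted(centers)

# Toy instance: population weight concentrated at the ends of a line.
demands = [0, 1, 2, 10, 11, 12]
weights = [5, 1, 1, 1, 1, 5]
dist = lambda a, b: abs(a - b)
print(vertex_substitution(demands, demands, weights, dist, centers=[1, 2]))
```

    Starting from a poor initial pair, repeated single-node swaps reach the weighted-optimal pair [0, 12]; unlike simulated annealing, this local search can get stuck in local optima on larger instances.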

  • 120.
    Hansson, Karl
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Dougherty, Mark
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Fleyeh, Hasan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Machine Learning Algorithms in Heavy Process Manufacturing2016In: American Journal of Intelligent Systems, ISSN 2165-8978, E-ISSN 2165-8994, Vol. 6, no 1, p. 1-13Article in journal (Refereed)
    Abstract [en]

    In a global economy, manufacturers mainly compete on the cost efficiency of production, as the prices of raw materials are similar worldwide. Heavy industry has two big issues to deal with: on the one hand, there is a lot of data that needs to be analyzed effectively; on the other hand, making big improvements via investments in corporate structure or new machinery is neither economically nor physically viable. Machine learning offers a promising way for manufacturers to address both problems, as they are in an excellent position to employ learning techniques on their massive store of historical production data. However, choosing a modelling strategy in this setting is far from trivial, and this is the objective of this article. The article investigates the characteristics of the most popular classifiers used in industry today. Support Vector Machines, Multilayer Perceptrons, Decision Trees, Random Forests, and the meta-algorithms Bagging and Boosting are the main subjects of this work. Lessons from real-world implementations of these learners are also provided, together with directions on when the different learners can be expected to perform well. The importance of feature selection, and relevant selection methods in an industrial setting, are further investigated. Performance metrics are also discussed for the sake of completeness.
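    The bagging meta-algorithm the article discusses can be illustrated in a few lines. The decision-stump base learner and the toy data below are invented for illustration and are unrelated to the article's industrial experiments.

```python
import random

def train_stump(data):
    """Choose the threshold on the single feature that best matches labels."""
    best_t, best_acc = None, -1.0
    for t in sorted({x for x, _ in data}):
        acc = sum((x >= t) == y for x, y in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda x: x >= best_t

def bagging(data, n_models=25, seed=0):
    """Bagging: train one stump per bootstrap resample, predict by majority vote."""
    rng = random.Random(seed)
    models = [train_stump([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    return lambda x: sum(m(x) for m in models) > n_models / 2

# Invented toy data: the label is True when the feature exceeds 5.
data = [(x, x > 5) for x in range(11)]
predict = bagging(data)
print([predict(x) for x in (2, 9)])
```

    Boosting differs in that resampling (or reweighting) is not uniform: each successive learner concentrates on the examples the previous ones got wrong.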

  • 121.
    Hasan, Mohammad Rashedul
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Separation and Execution of graphical engine on a cross platform IDE to enhance performance2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    “Biosim” is simulation software for modelling harvesting systems. The system can design a model for any logistics problem by combining several objects, so that the artificial system can show the performance of an individual model. The system also describes the efficiency of a particular model and its suitability for real-life application. So, when someone wishes to set up a logistics model such as a harvesting system, he or she can learn in advance the suitable positions for plants and factories, as well as the minimum number of objects, the total time to complete the task, the total investment required, and the total amount of noise produced by the establishment; it produces an advance overview of the model. But “Biosim” is quite slow: as an object-based system, it takes a long time to make its decisions. The main task here is to modify the system so that it works faster than before. The main objective of this thesis is therefore to reduce the load of “Biosim” by modifying the original system and increasing its efficiency, so that the whole system is faster and performs more efficiently when applied in real life. The concept is to separate the execution part of “Biosim” from its graphical engine and run this separated portion on a third-generation language platform; C++ was chosen as this external platform. After completing the proposed system, results for different models were observed. The results show that, for any type of plant or field and for any number of trucks, the proposed system is faster than the original: it takes at least 15% less time than the original “Biosim”, and the efficiency gain increases with the complexity of the model. The more complex the model, the more efficient the proposed system is compared to the original “Biosim”. Depending on the complexity of a model, the proposed system can be 56.53% faster than the original “Biosim”.

  • 122.
    Hedlund, Niklas
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    IT-Forensisk undersökning av flyktigt minne: På Linux och Android enheter2013Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    The ability to make an efficient investigation of volatile memory is something that is becoming more and more important in IT forensic investigations, partly for Linux- and Windows-based PC systems but also for mobile devices running Android or other mobile operating systems. Android uses a modified Linux kernel where the modifications exclusively adapt it to the demands of an operating system targeting mobile devices. These modifications include message-passing systems between processes as well as changes to the memory subsystems with respect to handling and monitoring. Since these two kernels are so closely related, it is possible to use the same basic principles for dumping and analysing the memory. The actual memory dumping is done by a kernel module; in this report the software LiME is used, which handles both kernels very well. Tools used to analyse the memory need to understand the memory layout used on the system in question and, depending on the analysis method used, may also need information about the different symbols involved. The tool used in this project is called Volatility, which in theory is capable of extracting all the information needed to make a correct investigation. The purpose was to expand on existing methods for analysing volatile memory on Linux-based systems, both PC machines and embedded systems like Android. Difficulties arose when the analysis of volatile memory for Android could not be completed according to the existing goals. The final result shows that memory analysis targeting the PC platform is both simpler and more straightforward than when Android is involved.

  • 123.
    Hellsten, Maria
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Utvärdering av forensiska verktyg och tekniker för handhållna enheter2015Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    In recent years, the market for mobile devices has grown and technological development has advanced at a rapid pace. More and more communication is done through mobile and handheld devices. Seizures of these devices therefore also increase, because very valuable forensic information is often stored on them. The purpose of this work is, based on a method combining a qualitative and a quantitative study, to examine forensic tools and techniques for handheld devices. The paper analyzes forensic descriptions of chip-off technology and its implementation, and Cellebrite's different extraction methods are tested to see what information can be extracted with each method. To seek answers, literature studies, our own investigations and semi-structured interviews have been conducted. The report is divided into sections describing the current operating systems Android and iOS, Cellebrite's tools and the chip-off technique, the investigations made, the interviews carried out and the results thereof. The results show that the type of information that can be obtained differs depending on the extraction method. Chip-off technology allows a full physical extraction on handheld devices that may be locked or broken, but requires some training, experience and specialized equipment.

  • 124.
    Hellström, Niklas
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Märkskyltprogram för mättransformatorer2006Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    This degree project was carried out at ABB Power Technologies in Ludvika with the aim of simplifying the production of rating plates for instrument transformers. Previously, all plates were designed by hand in Excel based on data from a calculation program, which meant that a template was needed for every variant in every language, leading to a very large number of templates. ABB wanted to eliminate both the multitude of templates and the time-consuming manual work, which could also result in plates with an inconsistent appearance. A program that automatically generates the plates from a standard template, using a file from the calculation program, saves much time and work. The person designing the plate then only needs to add a few data items and can select the language directly in the program instead of writing everything by hand. The result of this degree project is a program used by ABB to produce plates for almost all instrument transformers.

  • 125.
    Hellström, Peter
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Kauppi, Arvid
    Wikström, Johan
    Andersson, Arne W
    Sandblad, Bengt
    Kvist, Tomas
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Gideon, Anders
    Experimental evaluation of decision support tools for train traffic control2003In: The 6th World Congress on Railway Research (WCRR), Edinburgh, 2003Conference paper (Refereed)
  • 126.
    Helsing, Daniel
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Penetrationstest av Gagnefs Kommun2011Independent thesis Basic level (degree of Bachelor)Student thesis
  • 127.
    Hermansson, Daniel
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Integration av webbaserat analysprogram och kartprogram: Integration of web based analysis-software and map-software2007Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    As GIS (Geographic Information Systems) becomes increasingly common and user-friendly, WM-data has seen that customers would be interested in linking information from their operations to a map image. This makes it easier to grasp how the information is geographically distributed over an area, for example in order to arrange more efficient transports. WM-data, for whom this work was carried out, intends to produce a prototype that can be demonstrated to customers and other stakeholders to show that this is feasible by integrating existing systems. In this work, the prototype was developed with a focus on the forest industry and its inventories. The existing programs between which the integration is to be created are both web-based and run in a web browser. The analysis program to be used is called Insikt and is developed by the company Trimma; the map program is called GIMS and is WM-data's own product. It should be possible to analyze data in Insikt and create a report, which is then sent to GIMS, where the information is plotted on the map at the location to which it belongs. It should also be possible to select one or more areas on the map and send them to Insikt in order to analyze information from only the selected areas. A prototype with the desired functionality was produced during the project, but some work remains before there is a marketable product. The prototype has been shown to a number of interested parties, who found it interesting and believe it is something that could be widely used in many fields.

  • 128.
    Hossain, Mohammad Forhad
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Spanning Tree Approach On The Snow Cleaning Problem2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Snow cleaning is one of the important tasks during winter in Sweden, and every year the government spends a huge amount of money on it. In this thesis we generate a shortest road network of the city and place depots in different parts of the city for snow cleaning. We generate the shortest road network using a minimum spanning tree algorithm and find the depot positions using a greedy heuristic. When snow is falling, vehicles start working from the depots and clean the snow along the entire road network of the city. We formulate two types of model: an economic model and an efficient model. The economic model provides a good economical solution to the problem and uses fewer vehicles; the efficient model generates a good efficient solution and takes less time to clean the entire road network.
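    The minimum-spanning-tree step described above can be sketched with Kruskal's algorithm; the toy road segments below are invented, not the thesis's city network.

```python
def kruskal(n_nodes, edges):
    """Minimum spanning tree via Kruskal's algorithm with union-find.
    edges: list of (weight, u, v); returns the chosen edges."""
    parent = list(range(n_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):       # consider cheapest segments first
        ru, rv = find(u), find(v)
        if ru != rv:                    # keep the edge only if it joins two trees
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# Toy road network: 4 junctions, weighted road segments (km).
roads = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 1, 3), (3, 2, 3)]
tree = kruskal(4, roads)
print(tree, sum(w for w, _, _ in tree))  # → [(1, 0, 1), (2, 1, 2), (3, 2, 3)] 6
```

    The resulting tree keeps every junction reachable at minimum total length, which is the network the thesis's vehicles then traverse from the greedily placed depots.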

  • 129.
    Hou, Jiaqi
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    An Upgrade of Network Traffic Recognition System for SIP/VoIP Traffic Recognition2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The purpose of this project is to update the Network Traffic Recognition System (NTRS), which is proprietary software of Ericsson AB and Tsinghua University, and to use the updated tool for SIP/VoIP traffic recognition. Based on the original NTRS, I analyze its traffic recognition principles, redesign the structure and modules of the tool according to the characteristics of SIP/VoIP traffic, and finally implement the upgrade. A final test of the updated system with our SIP trace files gives a satisfactory result: the updated system achieves a confidently high recognition rate for both SIP session recognition and VoIP call recognition. In comparison with Wireshark, our updated system produces results extremely close to Wireshark's output, with much less processing time. In terms of practicability, the memory overflow problem is avoided, and the updated system can output specific information from SIP/VoIP traffic recognition, such as SIP type, SIP state, and VoIP state. The upgrade fulfills the demands of this project.

  • 130.
    HUSSAIN, TANVEER
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Transportation of Raw Material: Optimization of Production System and Reliability2010Independent thesis Basic level (degree of Bachelor)Student thesis
  • 131.
    Hägg, Andreas
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Säker hantering av mobila enheter i organisationers IT-miljö2014Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    The market for mobile devices and mobility has grown strongly in recent years, and the technology has developed rapidly. The rationale for increased mobility within a company is that it is often expected to yield positive effects in the form of increased business value and employee productivity. It is nevertheless important to remember that increased mobility also carries risks, risks that organizations should take into account. Tests performed in this work show that a great deal of information is stored on mobile devices, and parts of this information can be highly sensitive.

    The characteristic properties of smartphones and other mobile devices, above all that they are small and portable, make them especially vulnerable to a number of risks and threats. The greatest threat to security in connection with mobile devices is considered to be the risk of data loss. This loss often occurs when mobile devices are decommissioned, stolen or lost. Another major threat is that information is stolen or leaked because malicious code (malware) is present on the device.

    To address these risks and threats, one needs to look at how the organization should handle the situation. Security solutions exist on the market for handling many parts of the security problem. Mobile Device Management (MDM) is functionality aimed at establishing control over assets and information and enabling more secure use of mobile devices. A benchmark test and comparative analysis of a selection of market-leading MDM tools shows good support for several security-enhancing functions.

    AirWatch, MobileIron and XenMobile are three MDM tools that all provide good support for standardized security-enhancing functions for mobile devices. Making a good choice of tool, however, requires preparatory work. This is necessary to create the right conditions for a choice that supports the organization's needs, requirements and strategies.

  • 132.
    Höglund, Rikard
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Anonymitet på Internet: Tor och steganografi i nätverkstrafik2014Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
  • 133.
    Ignatius, Per
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Säkerhetstestning av webbapplikationer och CMS plattformen EPiServer2011Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    This work deals with security testing of web applications and the CMS platform EPiServer. In order for Know IT Dalarna to continue delivering secure web solutions, they requested a security analysis of the EPiServer platform as well as of their in-house developed applications. The purpose of the work was to raise the security of Know IT's web-based projects and at the same time make the developers more aware of security during the development phase. The result was that EPiServer as a platform provides adequate security. The direct flaws that were identified were up to either Know IT or the customer to remedy, and the responsibility was placed on whoever managed the operation of the website. The security tests performed included tests of access control, eavesdropping attacks, password attacks, SQL injections and XSS attacks. To simplify security testing, a checklist was created containing step-by-step instructions for performing a basic security test. It also contained recommendations to Know IT Dalarna on areas to be highlighted and examined in the future. The checklist can be used by the developers to ensure that an ongoing project maintains a good level of security. In the future, the list must be updated and kept in step with the constant technical development in the field.

  • 134.
    Ihlström, Pontus
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Vikten av säkra nätverk2011Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    The purpose of this thesis is to show how to use vulnerability testing to identify and search for security flaws in computer networks. The goal is partly to give a general description of different types of vulnerability testing and partly to present the method and results of a vulnerability test. A document containing the results of the vulnerability test, together with solutions to the high-risk vulnerabilities found, will be handed over. The goal is also to carry out and present this work in the form of a scholarly work. The problem was to show how to perform vulnerability tests and identify vulnerabilities in the organization's network and systems. Programs would be run under controlled circumstances such that they did not burden the network. Vulnerability tests were conducted sequentially, as data from the survey was needed to continue the scan. A survey of the network was made and data such as operating systems were collected in tables. A number of systems were selected from the tables and scanned with Nessus. The result was a table of the network and a table of found vulnerabilities. The table of vulnerabilities has helped the organization to prevent these vulnerabilities by updating the affected computers. A wireless network with WEP encryption, which is insecure, was also detected and decrypted.

  • 135.
    Ilar, Pontus
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Server-Side Security for Anonymous Data Collection from Handset2014Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Digital guides are something that appears more and more these days, and several have already been tried, with mixed results. One version was developed for "världsarvet Falun", where a PDA was placed in a box with speakers and a GPS, and short videos give information about certain locations in Falun. Another guide was developed for the iPhone, in which static maps were displayed with information rendered by a server. This thesis tries to develop a server-side solution for an application designed to give users a tour of selected so-called areas of interest (AOIs). The server only provides coordinates and information about the AOIs and the specific points within them. The solution uses a server with a database to store and provide information about these areas. The first prototype uses HTTPS to communicate with the application and SQL PDO queries to communicate with the database. Future work could be to upgrade the server to facilitate larger scalability for the solution.

  • 136.
    Islam, Md Rashedul
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    QoS evaluation of bandwidth schedulers in IPTV networks offered SRD fluid video traffic: a simulation study2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    IPTV is now offered by several operators in Europe, the US and Asia, using broadcast video over private IP networks that are isolated from the Internet. IPTV services rely on transmission of live (real-time) and/or stored video. Video on Demand (VoD) and time-shifted TV are implemented by IP unicast, while Broadcast TV (BTV) and near video on demand are implemented by IP multicast. IPTV services require QoS guarantees and can tolerate no more than 10^-6 packet loss probability, 200 ms delay, and 50 ms jitter. Low delay is essential for satisfactory trick-mode performance (pause, resume, fast forward) in VoD, and for fast channel change times in BTV. Internet Traffic Engineering (TE) is defined in RFC 3272 and involves both capacity management and traffic management. Capacity management includes capacity planning, routing control, and resource management. Traffic management includes (1) nodal traffic control functions such as traffic conditioning, queue management, and scheduling, and (2) other functions that regulate traffic flow through the network or arbitrate access to network resources. An IPTV network architecture includes multiple networks (core network, metro network, access network and home network) that connect devices (super head-end, video hub office, video serving office, home gateway, set-top box). Each IP router in the core and metro networks implements some queueing and packet scheduling mechanism at the output link controller. Popular schedulers in IP networks include Priority Queueing (PQ), Class-Based Weighted Fair Queueing (CBWFQ), and Low Latency Queueing (LLQ), which combines PQ and CBWFQ. The thesis analyzes several packet scheduling algorithms that can optimize the trade-off between system capacity and end-user performance for the traffic classes. Previously, the FIFO, PQ and GPS queueing methods were implemented in the simulator. This thesis implements the LLQ scheduler in the simulator and evaluates the performance of these packet schedulers. The simulator was provided by Ernst Nordström, built in a Visual C++ 2008 environment, and tested and analyzed in MatLab 7.0 under Windows Vista.
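    The LLQ idea described above (a strict-priority queue for real-time traffic on top of class-based queueing) can be sketched as follows. Weighted round-robin stands in here for CBWFQ's weighted fair queueing, and the class names and packets are invented.

```python
from collections import deque

class LLQ:
    """Low Latency Queueing sketch: one strict-priority queue for real-time
    traffic on top of weighted round-robin service for the other classes
    (a simple stand-in for CBWFQ's weighted fair queueing)."""

    def __init__(self, weights):
        self.priority = deque()
        self.classes = {name: deque() for name in weights}
        self.weights = weights
        self.credit = dict.fromkeys(weights, 0.0)

    def enqueue(self, pkt, cls=None):
        """cls=None marks a real-time packet for the strict-priority queue."""
        (self.priority if cls is None else self.classes[cls]).append(pkt)

    def dequeue(self):
        if self.priority:            # the low-latency class always goes first
            return self.priority.popleft()
        for name in self.classes:    # grant each class credit by its weight
            self.credit[name] += self.weights[name]
        for name, q in self.classes.items():
            if q and self.credit[name] >= 1:
                self.credit[name] -= 1
                return q.popleft()
        return None

# Invented traffic: one VoIP (real-time) packet plus two data classes.
sched = LLQ(weights={"video": 1, "data": 1})
sched.enqueue("d1", "data")
sched.enqueue("v1", "video")
sched.enqueue("voip1")
print([sched.dequeue() for _ in range(3)])  # → ['voip1', 'v1', 'd1']
```

    In a real router the priority class is also policed so it cannot starve the weighted classes; that safeguard is omitted in this sketch.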

  • 137.
    Jabbar, Hussain
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Color Segmentation using LVQ-Learning Vector Quantization2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    This thesis presents a color segmentation approach for traffic sign recognition based on LVQ neural networks. The RGB images were converted into HSV color space and segmented using LVQ depending on the hue and saturation values of each pixel. The LVQ neural network was used to segment red, blue and yellow on roads and traffic signs in order to detect and recognize them. LVQ was effectively applied to 536 sample images taken in different countries under different conditions, with 89% accuracy, and the execution time per image, measured over 31 images, was between 0.726 s and 0.844 s. The method was tested under different environmental conditions, and LVQ showed its capacity to segment color reasonably despite remarkable illumination differences. The results showed high robustness.
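    The LVQ segmentation idea can be sketched with the basic LVQ1 update rule: the winning prototype is pulled toward same-class samples and pushed away from others. The (hue, saturation) samples below are invented stand-ins for the thesis's pixel data.

```python
import math

def lvq1_train(samples, labels, prototypes, proto_labels, lr=0.1, epochs=20):
    """LVQ1 update rule: the winning (nearest) prototype is pulled toward a
    sample of its own class and pushed away from samples of other classes."""
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            i = min(range(len(prototypes)),
                    key=lambda j: math.dist(x, prototypes[j]))
            sign = 1 if proto_labels[i] == y else -1
            prototypes[i] = [p + sign * lr * (xk - p)
                             for p, xk in zip(prototypes[i], x)]
    return prototypes

def lvq_classify(x, prototypes, proto_labels):
    """Assign the class of the nearest prototype."""
    i = min(range(len(prototypes)), key=lambda j: math.dist(x, prototypes[j]))
    return proto_labels[i]

# Invented (hue, saturation) samples standing in for red and blue pixels.
samples = [(0.02, 0.9), (0.03, 0.8), (0.60, 0.9), (0.62, 0.7)]
labels = ["red", "red", "blue", "blue"]
protos = lvq1_train(samples, labels, [[0.2, 0.5], [0.5, 0.5]], ["red", "blue"])
print(lvq_classify((0.05, 0.85), protos, ["red", "blue"]))  # → red
```

    After training, each pixel of an image would be classified this way by its hue and saturation, yielding the color-segmented mask used for sign detection.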

  • 138.
    Jayaram, M. A.
    et al.
    Siddaganga Instistute of Technology.
    Fleyeh, Hasan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Soft computing in biometrics: a pragmatic appraisal2013In: American Journal of Intelligent Systems, ISSN 2165-8978, E-ISSN 2165-8994, Vol. 3, no 3, p. 105-112Article in journal (Refereed)
    Abstract [en]

    The ever-increasing spurt in digital crimes such as image manipulation, image tampering, signature forgery, image forgery, and illegal transactions has intensified the demand to combat these forms of criminal activity. In this direction, biometrics, the computer-based validation of a person's identity, is becoming more and more essential, particularly for high-security systems. The essence of biometrics is the measurement of a person's physiological or behavioral characteristics, which enables authentication of that person's identity. Biometric-based authentication is also becoming increasingly important in computer-based applications because the amount of sensitive data stored in such systems is growing. The new demands on biometric systems are robustness, high recognition rates, the capability to handle imprecision and uncertainties of a non-statistical kind, and great flexibility. It is exactly here that soft computing techniques come into play. The main aim of this write-up is to present a pragmatic view of applications of soft computing techniques in biometrics and to analyze their impact. It is found that soft computing has already made inroads, in terms of individual methods or in combination. Applications of varieties of neural networks top the list, followed by fuzzy logic and evolutionary algorithms. In a nutshell, the soft computing paradigms are used for biometric tasks such as feature extraction, dimensionality reduction, pattern identification, pattern mapping and the like.

  • 139.
    Jayaram, M.A.
    et al.
    Siddaganga Institute of Technology.
    Fleyeh, Hasan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Convex Hulls in Image Processing: A Scoping Review2016In: American Journal of Intelligent Systems, ISSN 2165-8978, E-ISSN 2165-8994, Vol. 6, no 2, p. 48-58Article in journal (Refereed)
    Abstract [en]

    The demands on image-processing-related systems are robustness, high recognition rates, the capability to handle incomplete digital information, and great flexibility in capturing the shape of an object in an image. It is exactly here that convex hulls come into play. The objective of this paper is twofold. First, we summarize the state of the art in computational convex hull development for researchers interested in using convex hulls in image processing to build their intuition or generate nontrivial models. Second, we present several applications involving convex hulls in image-processing-related tasks. By this, we have striven to show researchers the rich and varied set of applications they can contribute to. This paper also makes a humble effort to enthuse prospective researchers in this area. We hope that the resulting awareness will lead to new advances for specific image recognition applications.

  • 140.
    Jayaram, M.A.
    et al.
    Siddaganga Institute of Technology, Tumakuru, Karnataka, India.
    Fleyeh, Hasan
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Whither Edge Computing? – A Futuristic Review2018In: International Journal of Applied Research on Information Technology and Computing, ISSN 0975-8070, Vol. 9, no 2, p. 180-188Article in journal (Refereed)
    Abstract [en]

    It is a well-known fact that the present-day Internet is increasingly laden with bandwidth-demanding content, owing to the ever-increasing number of things being attached to it on a daily basis. Hand in hand, mobile networks and data networks are converging onto the cloud computing bandwagon. Edge computing, as a promising feature, has already made inroads to face future requirements and to address the exponential demands on the cloud. This feature is all about placing computing power and storage in the vicinity of the network edge. It is asserted that this scheme of operation reduces data transport time, yields quick response times, and increases availability. Edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. In this paper, we explain the drivers of edge computing and delve into the various types of edge computing currently available and expected in the near future. This paper is intended to draw a comprehensive picture of what is happening at the edge currently and what will happen in the foreseeable future.

  • 141.
    Jiang, Xiaowen
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    A fuzzy logic controller for intestinal levodopa infusion in Parkinson’s disease2010Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [en]

    The aim of this work is to evaluate a fuzzy system for levodopa infusion in Parkinson's disease for different types of patients, based on simulation experiments using a pharmacokinetic-pharmacodynamic model. The fuzzy system controls the patient's condition by adjusting the infusion flow rate, and it must be effective for three types of patients: sensitive, typical, and tolerant. Sensitive patients react strongly to the drug dose, whereas tolerant patients are resistant to it, so it is important for the controller to handle dose increments and decrements so as to adapt to the different patient types. Using the fuzzy system, all three types of patients received useful control in the simulated medication treatment, and the controller performs well when the initial infusion flow rate lies within a small range around the approximately optimal value for the current patient type.
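    A controller of the kind the thesis describes can be sketched as a small Mamdani-style rule base. Everything below is illustrative, not the thesis's actual design: the membership shapes, the rule outputs, and the normalised "effect" input are assumptions.

```python
# Back-of-the-envelope fuzzy-control sketch: nudge the infusion flow
# rate up or down from a normalised drug-effect reading. All membership
# functions and rule outputs are invented for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flow_adjustment(effect):
    """effect in [-1, 1]: -1 under-medicated, 0 on target, 1 over-medicated.
    Returns a flow-rate change via weighted-average defuzzification."""
    rules = [
        (tri(effect, -1.5, -1.0, 0.0), +0.2),   # too low   -> increase flow
        (tri(effect, -1.0,  0.0, 1.0),  0.0),   # on target -> hold
        (tri(effect,  0.0,  1.0, 1.5), -0.2),   # too high  -> decrease flow
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

    Overlapping memberships make the output change smoothly between the dose-increment and dose-decrement regimes, which is what lets one rule base serve sensitive and tolerant patients alike.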

  • 142. Johansson, D.
    et al.
    Ericsson, A.
    Johansson, A.
    Medvedev, A.
    Nyholm, D.
    Ohlsson, F.
    Senek, M.
    Spira, J.
    Thomas, Ilias
    Dalarna University, School of Technology and Business Studies, Microdata Analysis.
    Westin, Jerker
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Individualization of levodopa treatment using a microtablet dispenser and ambulatory accelerometry2018In: CNS Neuroscience & Therapeutics, ISSN 1755-5930, E-ISSN 1755-5949, Vol. 24, no 5, p. 439-447Article in journal (Refereed)
    Abstract [en]

    Aim

    This 4‐week open‐label observational study describes the effect of introducing a microtablet dose dispenser and adjusting doses based on objective free‐living motor symptom monitoring in individuals with Parkinson's disease (PD).

    Methods

    Twenty-eight outpatients with PD on stable levodopa treatment with dose intervals of ≤4 hours had their daytime doses of levodopa replaced with levodopa/carbidopa microtablets, 5/1.25 mg (LC-5), delivered from a dose dispenser device with programmable reminders. After 2 weeks, doses were adjusted based on ambulatory accelerometry and clinical monitoring.

    Results

    Twenty-four participants completed the study per protocol. The daily levodopa dose was increased by 15% (112 mg, P < 0.001) from period 1 to period 2, and the dose interval was reduced by 12% (22 minutes, P = 0.003). Treatment adherence to LC-5 was high in both periods. The MDS-UPDRS parts II and III, disease-specific quality of life (PDQ-8), wearing-off symptoms (WOQ-19), and nonmotor symptoms (NMS Quest) improved after dose titration, but the generic quality-of-life measure EQ-5D-5L did not. Blinded expert evaluation of the accelerometry results demonstrated improvement in 60% of subjects and worsening in 25%.

    Conclusions

    The introduction of a levodopa microtablet dispenser together with accelerometry-aided dose adjustments improved PD symptoms and quality of life in the short term.

  • 143. Johansson, Dongni
    et al.
    Thomas, Ilias
    Dalarna University, School of Technology and Business Studies, Microdata Analysis.
    Ericsson, Anders
    Johansson, Anders
    Medvedev, Alexander
    Memedi, Mevludin
    Nyholm, Dag
    Ohlsson, Fredrik
    Westin, Jerker
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Bergquist, Filip
    Evaluation of a sensor algorithm for motor state rating in Parkinson's disease2019In: Parkinsonism & Related Disorders, ISSN 1353-8020, E-ISSN 1873-5126Article in journal (Refereed)
    Abstract [en]

    INTRODUCTION: A treatment response objective index (TRIS) was previously developed based on sensor data from pronation-supination tests. This study aimed to examine the performance of TRIS for medication effects in a new population sample with Parkinson's disease (PD) and its usefulness for constructing individual dose-response models.

    METHODS: Twenty-five patients with PD performed a series of tasks throughout a levodopa challenge while wearing sensors. TRIS was used to determine motor changes in pronation-supination tests following a single levodopa dose, and was compared to clinical ratings including the Treatment Response Scale (TRS) and six sub-items of the UPDRS part III.

    RESULTS: As expected, correlations between TRIS and clinical ratings were lower in the new population than in the initial study. TRIS was still significantly correlated to TRS (rs = 0.23, P < 0.001) with a root mean square error (RMSE) of 1.33. For the patients (n = 17) with a good levodopa response and clear motor fluctuations, a stronger correlation was found (rs = 0.38, RMSE = 1.29, P < 0.001). The mean TRIS increased significantly when patients went from the practically defined off to their best on state (P = 0.024). Individual dose-response models could be fitted for more participants when TRIS was used for modelling than when TRS ratings were used.

    CONCLUSION: The objective sensor index shows promise for constructing individual dose-response models, but further evaluations and retraining of the TRIS algorithm are desirable to improve its performance and to ensure its clinical effectiveness.

  • 144.
    Jomaa, Diala
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Fingerprint Segmentation2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    In this thesis, a new algorithm is proposed to segment the foreground of a fingerprint from the image under consideration. The algorithm uses three features: mean, variance, and coherence. Based on these features, a rule system is built to help the algorithm segment the image efficiently. In addition, the proposed algorithm combines split-and-merge with a modified Otsu method. Enhancement techniques such as Gaussian filtering and histogram equalization are applied to improve the quality of the image. Finally, a post-processing technique is implemented to counter undesirable effects in the segmented image. Fingerprint recognition is one of the oldest biometric techniques. Everyone has a unique and unchangeable fingerprint, and based on this uniqueness and distinctness, fingerprint identification has long been used in many applications. A fingerprint image is a pattern consisting of two regions: foreground and background. The foreground contains all the important information needed by automatic fingerprint recognition systems, whereas the background is a noisy region that contributes to the extraction of false minutiae. To avoid extracting false minutiae, several steps should be followed, such as preprocessing and enhancement. One of these steps is the transformation of the fingerprint image from a gray-scale image to a black-and-white image; this transformation is called segmentation or binarization. The aim of fingerprint segmentation is to separate the foreground from the background. Owing to the nature of fingerprint images, segmentation is an important and challenging task. The proposed algorithm was applied to the FVC2000 database. Manual examination by human experts shows that the proposed algorithm provides efficient segmentation results, demonstrated in diverse experiments.
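    The block-feature idea above can be illustrated with a toy sketch. The block size and thresholds are invented for the example; the coherence feature and the split-and-merge/Otsu stages of the thesis are omitted.

```python
# Illustrative block-wise rule-based segmentation in the spirit of the
# thesis: a block is foreground when its grey-level variance is high
# (ridge/valley contrast) and its mean is not near-white background.
# BLOCK, VAR_MIN and MEAN_MAX are made-up values for the example; the
# image dimensions are assumed to be multiples of BLOCK.

BLOCK = 4
VAR_MIN = 100.0     # hypothetical variance threshold
MEAN_MAX = 200.0    # hypothetical mean threshold (0 = black, 255 = white)

def block_stats(img, r0, c0):
    """Mean and variance of one BLOCK x BLOCK window of a 2-D list."""
    vals = [img[r][c] for r in range(r0, r0 + BLOCK)
                      for c in range(c0, c0 + BLOCK)]
    m = sum(vals) / len(vals)
    v = sum((x - m) ** 2 for x in vals) / len(vals)
    return m, v

def segment(img):
    """Return a block map: True = foreground, False = background."""
    out = []
    for r in range(0, len(img), BLOCK):
        row = []
        for c in range(0, len(img[0]), BLOCK):
            m, v = block_stats(img, r, c)
            row.append(v > VAR_MIN and m < MEAN_MAX)
        out.append(row)
    return out
```

    A high-contrast ridge pattern gives large within-block variance, while a uniform background block gives variance near zero, so even this two-feature rule separates the regions on clean images.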

  • 145.
    Jomaa, Diala
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Dougherty, Mark
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Triggering Radar Speed Warning signs using Association Rules and Clustering Techniques2012Conference paper (Refereed)
    Abstract [en]

    Radar speed warning signs (RSWS) have been used in recent years across Sweden and elsewhere in the world. Such signs measure vehicle speed using radar and are designed to display a message when the driver exceeds a pre-set threshold speed, often set relative to the speed limit on a particular road segment. RSWS are typically placed in locations perceived to be problematic by the relevant authorities; excessive speeding and road accidents are examples of such perceived problems. Deploying RSWS in many relevant locations is often impractical due to the lack of the power supply needed for operation. Battery-driven RSWS are an alternative but are less attractive because of limited running time and frequent maintenance (changing batteries, etc.). Solar-powered RSWS are therefore more desirable; however, these signs also depend on batteries that need to be charged, and their duration of operation largely depends on how often the sign is triggered, since constant activation drains the battery. It is desirable to trigger the sign only when necessary. Hence, the main goal of this research is to design a model that optimises the performance of RSWS depending on prevailing conditions, i.e., traffic flows during different times of the day and so on. Vehicle speed data were collected at a test site in Sweden at all hours of the day. This paper uses a hybrid system based on the Apriori and k-means clustering algorithms. The Apriori algorithm is a simple and efficient way to determine association rules among attributes, in particular to discover the most common combinations that occur within the data set. K-means clustering is used to quantize the input variables into smaller clusters from which the trigger threshold value can be derived. The proposed hybrid system was able to trigger solar-powered RSWS efficiently.
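    The clustering half of the hybrid can be sketched as a 1-D k-means over logged speeds, with the trigger threshold derived from the cluster structure. This is an illustrative reconstruction, not the paper's code, and the Apriori stage that mines associations with time of day is not shown.

```python
import random

# Toy sketch: cluster logged vehicle speeds with 1-D k-means, then take
# the trigger threshold as the midpoint between the two fastest cluster
# centres (an assumption for illustration, not the paper's rule).

def kmeans_1d(speeds, k=2, iters=20, seed=0):
    """Plain 1-D k-means; returns the sorted cluster centres."""
    rng = random.Random(seed)
    centers = sorted(rng.sample(speeds, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in speeds:
            win = min(range(k), key=lambda j: abs(s - centers[j]))
            clusters[win].append(s)
        centers = sorted(sum(c) / len(c) if c else centers[i]
                         for i, c in enumerate(clusters))
    return centers

def trigger_threshold(speeds, k=2):
    """Vehicles closer to the fastest cluster than to the next one
    trigger the sign; the boundary is the midpoint of the two centres."""
    centers = kmeans_1d(speeds, k)
    return (centers[-2] + centers[-1]) / 2
```

    On a bimodal speed log (free-flowing versus speeding vehicles) the threshold lands between the two modes, so the sign stays dark for typical traffic and fires only for the fast cluster, conserving battery.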

  • 146.
    Jomaa, Diala
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Dougherty, Mark
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Edvardsson, Karin
    Dalarna University, School of Technology and Business Studies, Construction.
    Effectiveness of trigger speed of vehicle-activated signs on mean and standard deviation of speed2016In: Journal of Transportation Safety and Security, ISSN 1943-9962, Vol. 8, no 4, p. 293-309Article in journal (Refereed)
    Abstract [en]

    Excessive or inappropriate speed is a key factor in traffic fatalities and crashes. Vehicle-activated signs (VASs) are therefore being used extensively to reduce speeding and increase traffic safety. A VAS is triggered by an individual vehicle when the driver exceeds a speed threshold, otherwise known as the trigger speed (TS). The TS is usually set to a constant, normally proportional to the speed limit on the particular segment of road; decisions concerning the TS largely rest with the local traffic authorities. The primary objective of this article is to help authorities determine the TS that has an optimal effect on the mean and standard deviation of speed. The data were systematically collected using radar technology whilst varying the TS. The results show that when the applied TS was set near the speed limit, the standard deviation was high. However, the standard deviation decreased substantially when the threshold was set to the 85th percentile, and this decrease occurred without a significant increase in the mean speed. It is concluded that the optimal threshold speed should approximate the 85th percentile, though VASs should ideally be individually calibrated to the traffic conditions at each site.
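    The 85th-percentile trigger speed recommended above is straightforward to derive from logged speeds; a minimal sketch (the toy speed log is invented):

```python
import statistics

# Deriving an 85th-percentile trigger speed from a log of observed
# vehicle speeds. statistics.quantiles with n=100 yields the 1st..99th
# percentiles (exclusive method by default).

def percentile85(speeds):
    """85th percentile of the observed speeds."""
    return statistics.quantiles(speeds, n=100)[84]

speeds = list(range(41, 61)) * 5          # toy log: 41..60 km/h, 5 passes each
trigger = percentile85(speeds)
```

    In practice the log would be windowed (per hour or per day) so that the threshold tracks prevailing conditions rather than a single historical distribution.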

  • 147.
    Jomaa, Diala
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Dougherty, Mark
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Edvardsson, Karin
    Dalarna University, School of Technology and Business Studies, Construction.
    Effectiveness of vehicle activated signs on mean speed and standard deviation of vehicle speed2014Report (Other academic)
  • 148.
    Jomaa, Diala
    et al.
    Dalarna University, School of Technology and Business Studies, Microdata Analysis.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Automatic trigger speed for vehicle activated signs using Adaptive Neuro fuzzy system and Random ForestIn: International Journal on Advances in Intelligent Systems, ISSN 1942-2679, E-ISSN 1942-2679Article in journal (Refereed)
  • 149.
    Jomaa, Diala
    et al.
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Dynamic trigger speed for vehicle activated signs2016Conference paper (Refereed)
    Abstract [en]

    Optimal trigger speeds for vehicle-activated signs were not considered in previous studies. The main aim of this paper is to summarise the findings on the optimum trigger speed for vehicle-activated signs. A secondary aim is to build and report a dynamic trigger speed, based on an accurate predictive model, for triggering the operation of vehicle-activated signs. A data-based calibration method for the radar used in the experiment has been developed and evaluated. Results from the study indicate that the optimal trigger speed should primarily aim at lowering the standard deviation, and that it should be set near the 85th-percentile speed to achieve this. A comparative study investigating several predictive models showed that random forest is an appropriate model for dynamically predicting trigger speeds.

  • 150.
    Jomaa, Diala
    et al.
    Dalarna University, School of Technology and Business Studies, Microdata Analysis.
    Yella, Siril
    Dalarna University, School of Technology and Business Studies, Computer Engineering.
    Predicting automatic trigger speed for vehicle-activated signs2018In: Journal of Intelligent Systems, ISSN 0334-1860, E-ISSN 2191-026XArticle in journal (Refereed)
    Abstract [en]

    Vehicle-activated signs (VAS) are speed-warning signs activated by radar when the driver speed exceeds a pre-set threshold, i.e. the trigger speed. The trigger speed is often set relative to the speed limit and is displayed for all types of vehicles. It is our opinion that having a static setting for the trigger speed may be inappropriate, given that traffic and road conditions are dynamic in nature. Further, different vehicle classes (mainly cars and trucks) behave differently, so a uniform trigger speed of such signs may be inappropriate to warn different types of vehicles. The current study aims to investigate an automatic VAS, i.e. one that could warn vehicle users with an appropriate trigger speed by taking into account vehicle types and road conditions. We therefore investigated different vehicle classes, their speeds, and the time of day to be able to conclude whether different trigger speeds of VAS are essential or not. The current study is entirely data driven; data are initially presented to a self-organising map (SOM) to be able to partition the data into different clusters, i.e. vehicle classes. Speed, time of day, and length of vehicle were supplied as inputs to the SOM. Further, the 85th percentile speed for the next hour is predicted using appropriate prediction models. Adaptive neuro-fuzzy inference systems and random forest (RF) were chosen for speed prediction; the mean speed, traffic flow, and standard deviation of vehicle speeds were supplied as inputs for the prediction models. The results achieved in this work show that RF is a reliable model in terms of accuracy and efficiency, and can be used in finding appropriate trigger speeds for an automatic VAS. 
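    The SOM stage described above can be sketched with a tiny winner-take-all map over (speed, length) samples. This is a toy reconstruction: the map size, learning-rate schedule, and feature values are invented, the neighbourhood function is omitted for a two-node map, and the ANFIS/random-forest prediction stage is not shown.

```python
import random

# Toy self-organising-map sketch: two nodes adapt to (speed km/h,
# length m) samples so that cars and trucks end up on different nodes.
# All numbers are illustrative, not from the paper.

def best_node(weights, x):
    """Index of the node whose weight vector is closest to sample x."""
    return min(range(len(weights)),
               key=lambda i: sum((w - xi) ** 2
                                 for w, xi in zip(weights[i], x)))

def train_som(samples, nodes=2, epochs=50, lr=0.3, seed=1):
    """Winner-take-all training with a linearly decaying learning rate."""
    rng = random.Random(seed)
    weights = [list(s) for s in rng.sample(samples, nodes)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)
        for x in samples:
            b = best_node(weights, x)            # best-matching unit
            weights[b] = [w + rate * (xi - w)
                          for w, xi in zip(weights[b], x)]
    return weights
```

    Once trained, the winning node for each incoming vehicle acts as its class label, and a per-class trigger speed can then be predicted for the next hour by a separate regression model, as the article does with random forest.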
