(In alphabetical order of authors)
Displacement Sensor for Detecting Sub-Micrometer Motion
The continuous size reduction of state-of-the-art nanoscale devices calls
for new, reliable sensors that measure the sub-micrometer displacement of
moving objects. The most challenging sensor design requirements depend on
the nano application: the shape and size of the area available for sensor
placement, and the electrical connections. The active sensing area has to
be created by the sensor alone and cannot incorporate the moving object.
Common nano applications involve objects that cannot tolerate the
additional weight of electrical connections and/or are made from materials
that cannot be electrically charged. These limitations lead to the design
of a very simple sensor with a small footprint.
Challenges and Solutions in Agricultural Robotics
By the year 2050, the world population is estimated to reach around
9 billion people. Given the limited arable land and the yearly increase in
the number of natural disasters destroying fields, experts expect that
food production must nearly double compared with today. To solve this
problem, the automation and optimization of agricultural processes must be
extended. Agricultural robotics plays a key role for this purpose.
Energy Efficiency in the Future Internet
According to major Telecom operators worldwide, there is a
significant need for Future Internet devices and network infrastructures to
be more energy-efficient, scalable and flexible, in order to realize the
extremely virtualized and optimized networks needed to effectively and efficiently
support a very large number of heterogeneous user-led services. In the
computing world, an important recent trend has been the move to energy
proportionality, i.e., the goal of having energy expenditure in proportion to
the instantaneous (rather than the peak) computational load. This has
motivated the adoption of virtualization and cloud computing as methods to
deliver software services. These developments save power in computation, but
increase the load on datacenter networks and on the Internet that supplies
them their data. Thus, the goal of energy proportionality has been extended
to datacenter networks and the Internet at large. However, the complex
interactions between the energy consumed by virtualized servers, the server
farms on which they execute, the datacenter networks that interconnect them,
and the wider network from which users access services and data, require a
holistic approach to energy efficiency, capable of embracing many different
aspects and basic strategies of current ICT and network technologies, where
the ultimate overall goal should be the rational usage of all physical
resources. In this perspective, energy efficiency (with respect to a
non-optimized exploitation of ICT equipment) may be viewed as an indicator of
the "health" of the overall computing and networking ecosystem. It
reflects the extent of exploitation of computing, storage, and communications
hardware capabilities to the degree needed to support the current workload
generated by applications at the required Quality of Service/Experience
(QoS/QoE) level. In this respect, flexibility and programmability in the
usage of physical resources (obviously including the network) come naturally
onto the scene as instruments that allow optimal dynamic resource allocation
strategies to be implemented in practice. The goal of such optimization can
actually be energy efficiency, but it will be achieved under dynamic
adaptation to the quality requirements imposed by running applications.
Tom Kazmierski and Matthew Walker
A vision for the many-core dream
Recent many-core architectures, that is, architectures with hundreds of
cores or more, suffer from a number of fundamental problems. For one, a
one-hundred ARM core system would not physically fit on the SoC that powers
a mobile phone or tablet device. Many current many-core prototypes use
existing ISA cores that were developed for traditional multi-core
processors. However, for a scalable, ultra-low-energy many-core
architecture to be successful, new designs are needed that result in
smaller and simpler cores. Additionally, current parallel software is not
parallel enough. Most applications running on smartphones do not make
effective use of the two or four cores available on smartphone processors.
In a many-core system, networking issues become difficult to solve due to
the need to provide data communication and resource sharing between
hundreds of cores. Not only is a huge number of wires needed between the
cores themselves to allow data transfers, but complex switching is also
required, which uses a considerable amount of space; moreover, the
communication network consumes a large proportion of the processor's
overall energy.
Integration of robot assistants and electrical stimulation for neurorehabilitation
This presentation is about new personal robot assistants that allow a significantly improved rehabilitation outcome for humans with upper limb impairment. These robots provide immediate hand/arm functioning, but also produce therapeutic (carry-over) effects. These systems contribute to the functioning of patients with paretic or paralyzed upper limbs after a central nervous system injury because they provide desired, controlled haptic assistance that is integrated into the preserved sensory-motor systems. These systems are hybrids: they integrate multichannel functional electrical stimulation, providing grasp/release, with the robot assistant. The hybrid system increases the motivation to exercise, activates afferent nerves and contributes to changes in cortical excitability and cortical plasticity. The modularity allows the use of individual components instead of the whole system, based on the level of impairment. These systems implement control that integrates learning from examples, allowing a clinician to set the level of assistance during the treatment; thereby, the treatment can be adjusted to fit the level of impairment. Stroke survivors are the major beneficiaries of the system, but other disabled persons could also benefit (e.g., CP, tetraplegia, surgery recovery). The performance of the system was validated in case series in the clinical environment.
Nuclear Power in the World: Dusk or Dawn
The first commercial nuclear power plants were built in the early
1960s. New construction starts peaked in the late 1970s. Two accidents,
the 1979 Three Mile Island accident in the US and the 1986 Chernobyl
accident in the USSR, led to phase-outs, slowdowns and moratoriums in
several countries, including the USA. Many predicted that nuclear power
was approaching its dusk. However, the need for base-load power and lower
electricity prices, the excellent performance of operating plants, and
concerns about fossil fuel emissions and climate change led to a nuclear
revival in the early 2000s. The recession of 2007-2008, the focus on
renewable sources of electricity, and the 2011 Fukushima Daiichi accident
in Japan have resulted in slowdowns, moratoriums and phase-outs in some
Western countries. For example, Germany decided to close eight nuclear
power plants and Japan ordered the shutdown of 48 plants for inspection.
Many predicted a new dusk for nuclear power. Four years after the
Fukushima Daiichi accident and seven years after the last large world
recession, the question remains: what is to be expected of nuclear power
in the future – a dusk or a new dawn?
Wireless Technology for Physiological Monitoring
Recent advances in wireless communications and sensor technologies have opened doors towards wireless physiological monitoring. In this presentation we will discuss methods of noninvasive physiological sensing and their applications from medicine to clean energy. Taking advantage of wireless technology advances, Doppler radar physiological sensing has the potential to provide a compact, low-cost platform for cardiopulmonary measurements. Application examples include sleep medicine, wearable systems, and occupancy sensing.
Vesna Crnojević Bengin
Near-Zero Metamaterials and Their Application to the Design of Microstrip Filters (with a short excursion to the acoustical domain)
Metamaterial theory and techniques have undergone almost 15 years of
evolution. One of the established outcomes of this development is the class
of so-called Near-Zero (NZ) metamaterials, which exhibit a new physical
effect: a zero value of the propagation coefficient at non-zero
frequencies. The talk will provide insights into various types of NZ
metamaterials and show that microwave filters based on this new physical
effect can have resonators much shorter than the guided wavelength, thus
paving the way for the micro-miniaturization of filters.
Marco Ceccarelli and Giuseppe Carbone
New Challenges in Service Robotics
Service robotics successfully expands the applications of robotic systems into more and more non-industrial areas. New tasks become feasible for robotic systems when the peculiarities of the applications are well understood, not only for technological developments but also for user acceptance and convenient operation. Examples of emerging challenging interests are discussed in this keynote paper with reference to robot applications in surveying and restoring goods of Cultural Heritage, as well as in rehabilitating and exercising the limbs of elderly people or injured patients. Mechanical aspects and features are stressed as playing important roles that can be successfully addressed when a proper mechatronic design makes user-oriented operation and understanding possible.
Diaphragms in Rectangular Waveguide: An Approach Based on Singular Integral Equation
In this paper, some elementary facts about singular integral equations and related topics are presented and used to analyze diaphragms in a rectangular waveguide. Simple but sufficiently accurate formulas for diaphragm susceptance are obtained, and some numerical examples are given for comparison with the results obtained from more accurate formulas.
Systems for analog and mixed signal processing in integrated technology
Despite the overwhelming trend of digitization of electronic
systems, and the many indisputable advantages of digital signal processing
over analog, the hardware facing the real, outside world is part of the
overall mixed system and remains mostly analog. This analog part of most
IC system chips is well known as the "analog front end" (AFE).
Therefore, analog electronic circuits, as part of an AFE, will not only
remain an important part of most integrated systems, but often represent
one of the bottlenecks in achieving low power consumption and small area
on an integrated circuit. This explains why the development and design of
new and improved (from the standpoint of IC design) analog electrical
circuits still plays an important role - and will likely continue to play
an important role - in the development of new and advanced systems-on-chip.
Low-cost and miniature passive sensor interfacing based on microcontrollers
Almost every microcontroller-based measurement system contains a
sensor. When using passive sensors, the overall performance of the system
greatly depends on the applied sensor interface. Many passive sensor
interfaces for microcontrollers are known, with different properties in
terms of cost, size and performance. Depending on the application, some
interface properties take precedence over others. When it comes to cost
and size, the direct sensor-to-microcontroller interface approach is
advantageous. This sensor interface uses time-to-digital (TD) conversion
to estimate the sensor measurand without an analog-to-digital (AD)
converter. The application of the
sensor interface covers all types of passive sensors (resistive, capacitive
and inductive), all sensor configurations (single-ended, differential and
bridge), and yet includes calibration.
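As a concrete illustration of the time-to-digital approach for a resistive sensor, the following sketch simulates the classic capacitor charge-time measurement with single-point calibration. All component values (capacitor, supply, threshold, timer clock) and names are illustrative assumptions, not values from the talk.

```python
import math

# Simulated direct resistive-sensor-to-microcontroller interface:
# the MCU times how long the sensor resistance takes to charge a
# known capacitor to the input-high threshold, so the measurand is
# obtained by time-to-digital conversion, with no AD converter.
# All component values below are illustrative assumptions.

C = 100e-9           # reference capacitor, farads
VDD, VTH = 3.3, 2.0  # supply and input-high threshold, volts
F_TIMER = 1e6        # timer clock, Hz

def charge_ticks(r_sensor):
    """Timer ticks counted until the RC node crosses VTH (ideal model)."""
    t = r_sensor * C * math.log(VDD / (VDD - VTH))
    return round(t * F_TIMER)

# Single-point calibration with a known reference resistor removes the
# dependence on the exact capacitor and threshold values.
R_REF = 10_000.0
TICKS_REF = charge_ticks(R_REF)

def estimate_resistance(ticks):
    return R_REF * ticks / TICKS_REF

r_est = estimate_resistance(charge_ticks(4_700.0))  # close to 4.7 kOhm
```

The same ratio-based calibration carries over to capacitive sensors by swapping the roles of the known and unknown components.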
Ivica Kostanić, Hamad Almohamedh and Fahad
Non-Referenced Objective Streaming Video Quality Evaluation in Cellular Networks of 3rd and 4th Generation
This talk presents a novel methodology for the quality evaluation of streaming video data services. The methodology requires no reference, and it predicts the subjective experience of the video quality. The predictions are based on a nonlinear mapping between objective technical metrics collected by the user equipment and subjective scores given by human evaluators. The objective metrics may be taken from various levels of the protocol stack. In the current implementation, the nonlinear mapping is accomplished through a neural network. The performance of the methodology has been tested using data from UDP streaming video services over LTE (4G) and HSPA (3G). The agreement between the predictions and the subjective quality evaluation scores is excellent.
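As a schematic illustration of such a nonlinear metric-to-score mapping (not the authors' trained model), a small feed-forward network can map a vector of objective metrics to a predicted Mean Opinion Score. The chosen input metrics and all weights below are placeholder assumptions:

```python
import numpy as np

# Toy feed-forward network mapping objective metrics (assumed here:
# packet loss ratio, throughput in Mbit/s, jitter in ms) to a
# predicted Mean Opinion Score on the 1-5 scale. The weights are
# random placeholders, not trained values from the talk.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def predict_mos(metrics):
    h = np.tanh(metrics @ W1 + b1)         # hidden layer
    raw = np.tanh(h @ W2 + b2)[0]          # scalar in (-1, 1)
    return float(1.0 + 2.0 * (raw + 1.0))  # map to the (1, 5) MOS range

mos = predict_mos(np.array([0.01, 5.0, 12.0]))
```

In practice the weights would be fitted by regression against subjective scores collected from human evaluators.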
Multiple-Resonator-Based Harmonic Analysis
A resonator-based observer is a recursive algorithm that can be used to calculate the harmonic components of periodic signals. One of the advantages of recursive spectrum estimation algorithms is that they have better tracking properties than block-based methods (e.g., the DFT: Discrete Fourier Transform). This is particularly important when the spectrum estimation is used in real-time systems. In addition, the frequency of the signal does not need to be on the DFT grid. By introducing multiple resonators, i.e., cascades of identical resonators in parallel, the classical frequency sampling method based on the direct utilization of the Lagrange interpolation technique, corresponding to the single-resonator-based structure, is extended to a rather efficient Hermite interpolation scheme. In addition, the output taps of the multiple resonators may fix not only the complex harmonic values but also, according to the actual resonator multiplicity, their first, second, third, fourth, and higher derivatives at the corresponding frequency. In order to adapt the obtained digital differentiators to their optimized frequency responses around the harmonic frequencies, it is possible to reshape the filters' transfer functions. The estimation technique is suitable for application over a wide range of frequency changes, transient conditions, and interharmonic presence, with the benefits of reduced complexity and computational effort. To demonstrate the performance of the developed algorithm, computer-simulated data records are processed.
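For the single-resonator case, the closely related sliding DFT gives a concrete example of a recursive single-bin estimator: one resonator, with its pole on the unit circle at the bin frequency, tracks one harmonic sample by sample. This is a simplified illustration only, not the multiple-resonator/Hermite scheme described above:

```python
import numpy as np

# Recursive single-bin spectrum estimator (sliding DFT): a single
# resonator tuned to bin k of an N-point window. After each new
# sample, X holds the k-th DFT bin of the most recent N samples.

def sliding_dft_bin(x, N, k):
    w = np.exp(2j * np.pi * k / N)   # resonator pole on the unit circle
    X = 0j
    buf = np.zeros(N)                # delay line of the last N samples
    out = []
    for n, xn in enumerate(x):
        oldest = buf[n % N]
        buf[n % N] = xn
        X = w * (X + xn - oldest)    # recursive window update
        out.append(X)
    return np.array(out)
```

Unlike a block DFT, the estimate is refreshed at every sample, which is what gives resonator-based observers their superior tracking in real-time use; the multiple-resonator structure extends this idea by cascading identical resonators per bin.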
Magnetic behaviour of mechanically milled ZnO: influence of milling media
Dilute magnetic semiconductors (DMS) can potentially be applied
in spintronic devices due to the coupling of their electronic and magnetic
properties. Recently, experimental and theoretical investigations of DMS
have drawn much attention. ZnO is a transparent semiconductor and also a
cheap material with a rich variety of properties. It has been widely
investigated in the past decades because of its applications in
optoelectronic, piezoelectric, optical, and other fields. When doped with
other transition metals, ZnO can be transformed into an interesting DMS.
Various experimental techniques can be used to fabricate ZnO-based DMS.
Depending on the synthesis conditions, different nanoscale ZnO materials,
exhibiting different magnetic properties, can be obtained. Milling in
high-energy mills is one of the methods for preparation of nanocrystalline
powders and, in particular, ZnO powder. If the milling assembly is carefully
selected, including the milling media and parameters, it is possible to
obtain large amounts of powder, with accurate control of particle and
crystallite size, amount and types of defects and impurities.
Milorad Tošić, Zorica Nikolić, Valentina Nejković,
Bojan Dimitrijević, Nenad Milošević
Spectrum Sensing Coordination for FIRE LTE testbeds
One of the main challenges for modern telecommunication systems is the pressure to keep up with constantly increasing requirements for higher data rates. This is particularly critical for mobile communication systems, due to limited spectrum availability. Even though the Long Term Evolution (LTE) standard provides high data rates, it may not be enough for future demands. Higher spectrum bandwidth can be achieved through the combined use of licensed and unlicensed spectrum. LTE-Advanced has introduced a mechanism, named carrier aggregation, which is the key technology that enables unlicensed spectrum usage (LTE-U). However, in order to use the unlicensed spectrum, there has to be coordination between the LTE users and the native unlicensed spectrum users, such as WiFi or Bluetooth. The first step in spectrum usage coordination is to obtain spectrum usage data by spectrum sensing. This paper presents an experimentation framework under development that will be able to support LTE-U experimentation involving different spectrum sensing and spectrum coordination mechanisms. Semantic descriptions and a spectrum sensing ontology will be adopted to facilitate cognitive coordination mechanisms. The experimentation is aimed at coordination between WiFi and LTE users in the 5 GHz band but, thanks to the semantic descriptions, it is flexible enough to support other frequency bands and other technologies as well.
Dušan M. Stipanović
Controlling Multiple Agents with Multiple Objectives
The challenges of controlling multiple agents with multiple objectives are not only related to but include problems in multi-player dynamic games, multi-objective optimization, and decentralized control and estimation. The additional complexity is introduced through agents’ dynamic models with possible nonlinearities, delays and perturbations as well as various state, input and communication constraints. In this talk we will present a number of results related to control and coordination of multi-agent dynamic systems with multiple objectives. As an illustration, some particular examples of multiple agent systems achieving multiple objectives such as guaranteed capture or evasion, collision avoidance, coverage control, proximity, and tracking, will be presented.
Applications of sound field analysis and synthesis in 3D audio context
The recent development of massive arrays of microphones or loudspeakers has stimulated numerous studies on sound field analysis and synthesis, together with the development of 3D audio applications that offer a refined auditory experience. Advanced 3D audio techniques such as High Order Ambisonics (HOA) or Wave Field Synthesis (WFS) rely on large arrays of loudspeakers distributed on the room boundaries. The radiation properties of a sound source may be simulated by digitally controlled spherical loudspeaker arrays (LSA). On the recording side, spherical microphone arrays (SMA) are used to capture a soundscape or a musical ensemble performance with high spatial resolution. In room acoustics, high-resolution sound field characterization can be achieved by measuring directional room impulse responses (DRIR) using combined microphone and loudspeaker arrays. The measured DRIRs may then be exploited in convolution-based reverberators for the auralization of room acoustics with faithful rendering of its spatial attributes, or for 3D audio-mixing applications. In this particular context, the sound engineer will typically want to fine-tune the perceptual attributes of the original DRIRs in order to better fit the aesthetics of the mix. Such parametric control first requires the development of an analysis-synthesis framework that operates on a space-time-frequency representation of the DRIRs. The theoretical and perceptual properties of these spatialization techniques are presented and illustrated in various contexts ranging from music performance, post-production and broadcast to virtual reality applications. Meanwhile, the ever-growing expansion of mobile devices calls for the deployment of broadcast solutions able to deliver 3D audio content and that allow for a personalized binaural rendering over headphones on the end-user side.
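Convolution-based auralization, as mentioned above, can be sketched in its simplest mono form. The synthetic exponentially decaying impulse response below stands in for a measured DRIR and is purely illustrative; real DRIRs are directional and multichannel.

```python
import numpy as np

# Minimal mono sketch of convolution-based auralization: a dry
# source signal is convolved with a (synthetic) room impulse
# response. This single-channel example is illustrative only.

fs = 8000                                  # sample rate, Hz
rng = np.random.default_rng(1)

# Synthetic RIR: 0.25 s exponentially decaying noise tail,
# normalized to unit peak.
t = np.arange(int(0.25 * fs)) / fs
rir = rng.standard_normal(t.size) * np.exp(-t / 0.05)
rir /= np.max(np.abs(rir))

# Dry source: a short 440 Hz tone burst.
n = np.arange(int(0.1 * fs))
dry = np.sin(2 * np.pi * 440.0 * n / fs)

wet = np.convolve(dry, rir)                # auralized ("wet") signal
```

The parametric control discussed in the abstract would operate upstream of this step, modifying a space-time-frequency representation of the DRIR before it is rendered by convolution.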