Republicii Street, No. 9, TM, Romania +40-256-201279
Tech

The vulnerability of Industrial Control Systems (ICS) is a major concern both for their beneficiaries and for everyone involved in developing such software solutions. While software engineers (in ICS and beyond) know that a solid software architecture and reliable coding are essential to reducing cyber risks, users sometimes neglect best practices in their daily use of software solutions. Alongside steering away from code that ends up vulnerable or easy to compromise via data breaches, and from faulty deployment or configuration, the ICS-based recommendations can do no harm. They are also useful when considering other types of software solutions.

We browsed a few recommended practices for keeping ICS software solutions strong and reliable on the users’ side.

The ICS environment creates an urgent need to solve or avoid vulnerabilities, and this urgency drives best practices in this line of work. By extension, any solution with implications or applications in the IoT field can be approached in the same way, even if the effects of its potential vulnerabilities are sometimes harder to grasp. Every element in an interconnected system matters, because the system is only as protected as its weakest element.

 

A seven-step list to avoid cyber risks, from Automation World

The list comes from a supplier of industrial cybersecurity software and services. It’s meant to provide a solid grounding for all the types of professionals who are involved in building and delivering safe and secure ICS software solutions.

 

As the author mentions, the list summarizes the “core steps every industrial company should take to secure their control systems at the most basic level”.

Design your network with cyber security in mind

  • Beneficiaries/users of software solutions should secure their networks to avoid exposure

Monitor your deployed software

  • Make sure you notice events and any abnormal behavior before it’s too late

Keep a tight inventory of all devices connected to your network

  • “From controllers to human-machine interfaces (HMIs) to engineering workstations, all assets on your network should be accurately inventoried so there aren’t any unknown devices, thereby enabling rogue assets to be quickly identified”.

Manage your logs, to understand the behavior of all involved devices

  • You can optimize performance once you have the complete image

Manage the configurations of all involved devices

Use industrial firewalls

Institute privilege control
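As a rough illustration of the inventory step above (the device names here are invented for the example, not taken from the article), comparing a network scan against the approved asset list is enough to flag rogue devices:

```python
# Hypothetical sketch: flag devices seen on the network that are
# missing from the approved asset inventory (all names are invented).
approved_inventory = {"plc-01", "hmi-03", "eng-ws-07"}

def find_rogue_devices(scanned_devices, inventory=approved_inventory):
    """Return devices present on the network but absent from the inventory."""
    return sorted(set(scanned_devices) - inventory)

# A scan that turns up one unknown asset:
rogues = find_rogue_devices(["plc-01", "hmi-03", "unknown-9f"])
```

Kept accurate and automated, a check like this is what turns a static inventory into a way of quickly identifying rogue assets.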

 

As a user of software solutions, do these recommendations sound familiar?

Avoiding cyber risks is a common effort. Good software engineers strive to build and deliver the best, most secure products. Vigilant users should join in by putting best practices, such as those mentioned above, into action.


Tech

Due to financial constraints and reluctance, as well as to economic differences, the way organizations worldwide implement and approach new tech (digitization, automation, Big Data, analytics, connectivity, IoT, and so on) is inconsistent.

For example, we took a look at a random-intercept site survey fielded on ZDNet and TechRepublic, in which European users voluntarily answered tech-related questions. The first-phase results (142 qualified responses from October 2018) show very few active IoT initiatives and low budget allocations for this sector.

Meanwhile, the mainstream experts note that investing in new tech is no longer a luxury, but “essential to keeping your business’s costs down, your profitability up, and your company thriving”.

 

A glimpse at the IoT-related situation in the European field

 

Back to the study mentioned above, let’s see a couple of figures:

 

  • 42% of the respondents say they have an interest in IoT, but no active initiatives;
  • 24% of them say they have a limited number of IoT-related functions in operation;

 

  • 1-9% of the respondents mention that 36% of their budget is directly related to their IoT-based initiatives;
  • 10% say that 50% or more of their budget is directly related to their IoT-based initiatives;

 

  • a maximum of 18% of the respondents list IoT as a top priority in their organization, for various lucrative purposes;
  • a maximum of 49% of the respondents list IoT as a top-3 priority in their organization (figures vary depending on purpose);
  • 51% of the respondents list ensuring the security of IoT networks/devices as the top challenge in using IoT in their company, followed by integrating IoT into their proprietary networks and by the learning curve of the new technology.

For those of us who work in B2B, it should be easy to compare these figures with the situation in the field as we know it. Although the study so far provides only partial, unrefined results, it clearly shows polarization around certain issues.

 

A few recommendations for the organizations caught in transition

For those who are interested in new tech but still undecided whether to see it as a luxury or a necessity, the article we found on ReadWrite provides schematic guidance. Such organizations should consider the following:

  • Quality software should always be the choice, and companies should also spend money to ensure they have the best technology expertise. Going for doubtful software and no experts is simply self-sabotage.
  • Using new tech to streamline business processes is the way to go, since it levels the playing field with your peers as well as your customers. Predictive maintenance and error reduction via technology are so characteristic of this age that those at the top of their field can’t even conceive of functioning without them.
  • A strong relationship with the modern customer means new tech, which ensures the customer experience stays within the parameters expected in the industry.

For more details on how businesses should “integrate new tech into (their) business in a way that cuts (…) costs, increases revenue, or provides more value to (…) customers”, see the source article here.


News

This week we focused on autonomous vehicles, inspired by this Innovation Enterprise post that targets the decisional factors inside organizations. All those concerned need to prepare in advance for the incoming changes that are set to affect industry after industry. Just to have an idea about the scale of the disruptions, their certainty and their character, let’s go through this topic, the way the mentioned publication approached it.

Autonomous vehicles will pervade the logistics and transportation fields, and the legal adjustments point to this

Firstly, we should note that the article looks at the way autonomous vehicles are progressing in the US. The implications and legal changes presented therefore relate to this geographical area, for now.

The Department of Transportation recently issued a new 70-page guidelines document. Officials stated that they changed the definitions of “operator” and “driver” to allow for AI-driven vehicles.

Based on this major change, those in charge of businesses in which such vehicles might play a role in the future should prepare in advance. Exactly how will autonomous vehicles impact such businesses? Advice for determining the future impact of the changes to come is included in the article.

Mapping out which companies will drive this type of change is yet another step in getting ready for it. Autonomous vehicles come with AI (Artificial Intelligence), and companies such as Google, Tesla, or even Domino’s Pizza have made no secret of their AI commitment.

Besides keeping an eye on the moves of such innovators/adopters, investing in AI-related stocks is a second option many will consider.

 

Going fully driverless means high technology and overcoming any security issues in autonomous vehicles

Analysts estimate that going fully driverless could become a reality in transportation around 2030, or by the mid-2040s. We are talking about trucks and goods delivery, and the figures already point to considerable cost-saving benefits, among others.

The road to this stage may seem long, and it is surely a winding one. However, the first official regulatory step may well be the new definition of what a driver is.

On the other hand, road safety and cyber security concerns show the need for high-performance, fault-proof software and hardware. Solutions that guarantee the safety of everyone taking part in traffic are the only way to move forward.

Going fully driverless in transportation and delivery does not yet have consumer acceptance. Autonomous vehicles still have some way to go to convince the majority of people that they pose no risk on the road, or at least that they are no riskier than vehicles with human drivers.

Autonomous trucks generally officially operate at what’s known as Level 2, an engineering standard that includes technologies such as automatic braking, acceleration, and some amount of steering. (Basic cruise control, by contrast, would amount to Level 0, and certain features such as lane-assist or adaptive cruise control would be Level 1). However, autonomous trucks are often effectively operating at Level 4 – or “high automation,” with their safety drivers generally only taking over on local roads.
– Richard Bishop, an automated vehicles industry analyst, quoted by US News

Image credit: curbed.com

 

 


Tech

Tech innovations can sometimes originate in simple ideas. They can also reproduce natural solutions. By observing similarities and connecting apparently disparate things, a Eureka moment is sometimes not far away. SAP Digital Interconnect shares an interesting parallel between bees and sensors, and consequently between a smart city and a hive.

The common denominator is secure and efficient communication. When designing IoT systems, this is often the underlying goal – getting the right data where you want it, while avoiding unauthorized access.

Let’s follow this story and see why SAP thinks that designers and managers have a lot to learn from the way bumblebees are organized.

 

Connected cities – the super-organisms of tomorrow

The overall technological weave that goes into any smart city is impressive. The result is a sum of subsystems – a mega-system. With all the data flows, interconnections, and necessary coordination, we may see it as a super-organism. How can professionals design and program it so that it works as such?

It wouldn’t be the first time in human progress that nature inspired us in solving complicated problems. Often the easiest answer is the best one: drawing inspiration from natural structures that have worked just fine for centuries.

The source article’s co-authors are also beekeepers. In this capacity they have noticed that a beehive is in fact a collection of different cells. It’s the relationship between these cells, and the way the bees enact it, that makes the whole system work together flawlessly.

In a similar way, the design and management of IoT networks could ensure that all sensors communicate seamlessly, that their hierarchy is respected, and that their functions are fulfilled.

What drew the attention of the two authors (one a Head of IoT Products, the other a Mobile Evangelist) are the pheromones. They are the common element that may activate or block communications, depending on necessity. In a network packed with sensors that have a double function – independent and interconnected – this is critical.

 

Looking forward to a highly functional, organized IoT environment

The prospect of a medium packed with sensors, devices, data flows and streaming information does not immediately suggest functionality, nor does it inspire an ergonomic structure.

However, layering, categorization and distributed activity could let the sensors take turns communicating with each other. For example, when collecting data that is not needed in real time, sensors could save activity and bandwidth by staying in a local mode. On receiving the designated signal, they would switch into transmit mode.
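That local/transmit duty cycle can be sketched as a tiny state machine (a minimal illustration; the class, mode names and values are our own, not from the SAP article):

```python
# Hypothetical sketch of a sensor that buffers readings locally and only
# transmits when it receives the designated signal, saving bandwidth.
class Sensor:
    def __init__(self):
        self.mode = "local"    # default: collect quietly, do not transmit
        self.buffer = []

    def read(self, value):
        # Non-urgent data is buffered instead of being sent in real time.
        self.buffer.append(value)

    def signal(self):
        # The designated signal switches the sensor into transmit mode.
        self.mode = "transmit"

    def flush(self):
        # In transmit mode, drain the buffer in one burst, then go local again.
        if self.mode != "transmit":
            return []
        sent, self.buffer = self.buffer, []
        self.mode = "local"
        return sent

s = Sensor()
s.read(21.5)       # buffered, nothing transmitted
s.read(21.7)
s.signal()         # now the next flush actually sends the batch
```

The point of the analogy is that, like pheromones in the hive, a single shared signal decides when communication is on or off for each cell of the system.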

Using the bees’ behavior as a model, the authors recommend three guiding principles in planning the various systems that go into a smart city:

  • Simplicity;
  • Security;
  • Smart communication.

Following these guidelines (explained in detail in the source article), any professional involved may end up with a better version of the system he or she is responsible for.

 


Tech

City planners and technologists joined forces at Smart Cities Week 2018 in a business simulation exercise. The theme was the future smart city, which is not at all far-fetched: all around the globe, initiatives for intelligent, connected, IoT-based urban centers are in various stages of development.

To understand the full scale of the requirements, challenges and multiple tasks involved in such a mega-structure, we browsed a recap of the simulation featured on IoT for All.

 

Things to expect when involved in a smart city project

Although increased specialization in the technical field may not place all project participants in a similar strategic posture, certain major challenges are bound to affect everyone. Even when layered into complementary projects, micro-projects and various team types, the professionals who make up the overall team still have to acknowledge the bigger picture.

What might a few of the common challenges be? Lateral thinking is one of them. For such a venture to succeed, those who make the decisions, as well as those who decide for their own area of responsibility, need to bring an indirect, creative approach to the table.

Balancing the budget against much-needed strategic investments is yet another must. Investing in new technologies blends with retrofitting older structures – but blending means seamless integration. Participants need to think things through. They also need a can-do attitude, a grasp of the big picture, and quick awareness of possible pressure or blockage points.

 

A process of bartering and flexible adaptation

The simulation in question was extremely interesting because it brought city planners together with technology specialists. While the urban-planning professionals brought an approach forged by the specifics of their everyday activity, some of the technologists had their first opportunity to see how and why this type of project is different.

City planners are often reluctant to change, stick to their budgets, and need to understand the technology and translate it into budget lines. Understanding the technology alone took a while for some participants, and the technologists had to meet them halfway on certain issues.

Collaboration and strategy are key on the road to a functional smart city, as the author repeatedly underlines. The aim, and the measure of success, is the citizens’ quality of life – a notion that seems within reach but needs defining. It also requires sustained planning so that it actually materializes into better services, better living conditions and so on.

 

Keywords for activities related to the smart cities of the future

We selected a few keywords that might not be the first things that come to mind when contemplating the idea of a smart city:

  • Multiple stakeholders
  • Constrained budgets
  • Challenging social dynamics

Other keywords are less surprising:

  • Economic growth
  • Intuitive infrastructure
  • Public-Private Partnerships
  • Collaborative processes

Materializing smart city projects may involve a sum of integrations. Some software solutions that previously existed in the private environment can be adopted into such a project; others may be custom-made programs and applications.

Whatever the case, having the right software engineering partner is of great use. Our company is an example of such a partner – with the experience accumulated in our projects over the years, we can tackle the IoT and mobile challenges of the major ventures of the future.



News

Recently, our colleague Petrut Ionel Romeo, Technical Project Manager at Lasting Software, gave a presentation entitled Satellite Communication solution for Cellular backhaul, as part of the autumn edition of Codecamp Timişoara.

In what is actually a solid business case for satellite-based mobile communications, he outlined the importance of this technology and its two main current purposes, in a way that is clear and palatable to non-specialists and technical people alike.

We will briefly recap the main ideas, leaving aside the more technical content. Nevertheless, let your imagination add to this material the context of a busy, dynamic conference, and you will have the image of a great day…

 

Satellite-related facts

Communication and weather satellites are situated in geostationary orbit, which makes it easier for Earth-based antennas to maintain continuous communication. A geostationary orbit keeps the position of the orbiting object constant relative to the ground: to a stationary observer on Earth, it seems motionless.

Over time, the number of satellites placed in geostationary orbit became high enough that the ring they form is now called the Geostationary Satellite Belt.

Controlling and maintaining sophisticated pieces of technology at such a remote distance is a huge task, with challenges specific to the system of satellites surrounding the Earth. Although technology has developed at an impressive rate and solved the earlier reach, maneuvering and management issues in ways not possible a while ago, the physical conditions (and not only those) make satellite technology a highly demanding field.

Satellites operate in extreme thermal conditions, risk being hit by space debris, face the hostility of the radiation belts, and so on. Perhaps the simplest challenge to understand is that satellites are remote devices, with specific procedures required to access, maintain and repair them.

Looking at the distance-related communication challenges:

  • The Clarke belt (the part of space in the plane of the equator designated for near-geostationary orbits) is about 36,000 km above sea level;
  • For communication back and forth, we are dealing with 72,000 km;
  • Given the speed of light, the resulting transmission delay is approximately 250-270 ms.
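The delay figure above follows directly from the distance and the speed of light; a back-of-the-envelope check (pure propagation only, before equipment and processing overhead):

```python
# Back-of-the-envelope check of the one-hop satellite propagation delay.
SPEED_OF_LIGHT_KM_S = 300_000      # ~3 x 10^8 m/s, rounded
ROUND_TRIP_KM = 72_000             # ground -> satellite -> ground

delay_ms = ROUND_TRIP_KM / SPEED_OF_LIGHT_KM_S * 1000
# ~240 ms of pure propagation; real-world overhead brings it to ~250-270 ms
```

This is why satellite links are treated as high-latency by design, regardless of how much bandwidth they offer.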

However, despite these challenges, communication satellites also have a few particular traits that put them above any other communication-enabling technology located on Earth.

The unmatchable advantages of the communication satellites

Satellites are extremely important because, in certain locations and/or circumstances, the terrestrial communication infrastructure is inefficient or even completely out of range or out of service.

There are two major (and, we may add, critical) situations where only satellite-enabled communications can support the flow of necessary data and messages.

  1. Geographically or architecturally isolated spots
    • Fiber and microwave transmissions are often unavailable in rural and remote areas, as there is no business interest;
    • Satellite communications are also efficient in mountains, deserts, islands, and other areas difficult to reach due to landforms and/or structures;
    • They can also relieve congested urban areas: stadiums, malls, markets, academic centers.
  2. Situations that require a solid communication infrastructure for emergency response

Satellite can act as a trustworthy backup for fiber, which is not always reliable. Faced with earthquakes or other cataclysms, both landline and fiber systems easily go down, unlike satellite-enabled communications, which become crucial for keeping in touch, monitoring and intervening in such moments.

 

Communication Satellites Architecture Options

In accordance with their type and the communication needs they must meet, satellites usually feature one of the following main architectures for data transmission:

  1. Point-to-point links
    • When there is a limited number of links;
    • For the Nb interface, for example;
    • For very high-speed trunks: 300+ Mbps.
  2. Hub-and-star architecture
    • Multiplexing on forward and return links;
    • Smaller remote equipment.

 

Mobile backhaul: key considerations

Mobile backhaul (MBH) is the transport of data and voice from distributed network sites to the network core. It enables mobile users to access the main data centers that host content and applications.

Terrestrial backhaul is the traditional method, and for a long while mobile networks and satellites existed in parallel due to the costs of satellite operations. Once recent advances in satellite technology reduced bandwidth costs – High Throughput Satellites (HTS), for example, brought savings of up to 70% – satellite backhaul became a viable, attractive solution that is gaining traction.

What does the shift towards satellite backhaul involve, as far as the mobile communications are concerned?

  • Mobile networks: Quality of Service (QoS) and Service Level Agreements (SLAs) are key for voice traffic, data traffic, signaling and management;
  • Mobile equipment: must be configured to accept the higher satellite delay;
  • Mobile traffic: needs optimization to lower the cost of satellite bandwidth.

 

Mobile traffic optimization

As mentioned above, traffic needs optimization to make the most efficient use of satellite bandwidth. To that end, here are a few possible methods for each telecommunication infrastructure type:

  • 2G TDM: remove silence and idle channels
    • Optimization on the Abis interface can bring up to 50% bandwidth gain;
    • Gains on the Ater, A and Nb interfaces as well.
  • 2G IP, 3G, 4G: leverage compression for headers (small packets) and payload (data)
  • VoLTE: compress the internal stacks as well (within the GTP tunnel)
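To see why header compression pays off for small packets, here is a toy delta-style scheme (an illustration only; the field names are invented, and real deployments use standardized schemes such as RoHC): the full header is sent once, and subsequent packets carry only the fields that changed.

```python
# Toy delta-style header compression: transmit the full header once,
# then only the fields that changed (real systems use standards like RoHC).
def compress(header, previous):
    if previous is None:
        return dict(header)                      # first packet: full header
    return {k: v for k, v in header.items() if previous.get(k) != v}

def decompress(delta, previous):
    merged = dict(previous or {})
    merged.update(delta)                          # apply only the changes
    return merged

h1 = {"src": "10.0.0.1", "dst": "10.0.0.2", "seq": 1}
h2 = {"src": "10.0.0.1", "dst": "10.0.0.2", "seq": 2}
d = compress(h2, h1)    # only the changed field travels over the link
```

For voice-sized packets, where the header can rival the payload in size, this kind of saving multiplies across every packet on the satellite link.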

 

*Additional feature: accelerating TCP in 4G

  • TCP traffic is captured transparently within the GTP tunnel, using “protocol spoofing”;
  • The mechanism tries to send as much data as possible, as soon as possible;
  • It uses the alternative window-size mechanism, with a larger window, to increase throughput (RFC 1323);
  • It implements an alternative acknowledgment mechanism, reducing the number of ACKs.
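The need for the RFC 1323 window-scaling option follows from the bandwidth-delay product: over a ~540 ms satellite round trip, even a modest link needs far more in-flight data than the classic 64 KB TCP window allows. A quick check (the 10 Mbit/s link speed is an assumed example, not a figure from the presentation):

```python
# Bandwidth-delay product over a geostationary satellite hop.
RTT_S = 0.540                  # ~270 ms each way over the Clarke belt
LINK_BPS = 10_000_000          # assumed example: a 10 Mbit/s link

bdp_bytes = LINK_BPS * RTT_S / 8     # data that must be "in flight"
classic_window = 64 * 1024           # max TCP window without RFC 1323 scaling
needs_scaling = bdp_bytes > classic_window
```

At these numbers the pipe holds roughly ten times the classic window, so without window scaling the sender would idle most of the time waiting for ACKs.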

 

The importance and requirements of the satellite solution

Mobile network operators have considered satellite backhaul an attractive option for a long time, since it is the only option in extreme situations, whether permanent (remote or congested spots) or temporary (cataclysms, catastrophes, major downtime incidents). While it used to be cost-prohibitive, advances in technology have now opened up this option.

The new opportunities come with their own specific traits and requirements:

  • Spectral efficiency is essential in reducing the cost of satellite bandwidth;
  • Quality of Service (QoS) is the key to fulfilling the mobile operators’ tight requirements;
  • Flexibility is important: it accommodates the multiple mobile technologies (2G, 3G, 4G) and the various network configurations (including rural low-power-consumption scenarios);
  • The technology allows bandwidth sharing and dynamic adjustment to real-time traffic and weather conditions;
  • Scalability is critical for:
    • Typical mobile backhaul networks: 20 to 100 remote sites;
    • Typical small cell networks: 1,000 sites;
    • Increasingly, mobile operators also want other services (e.g. enterprise services) hosted on the same satellite solution, resulting in even more remote sites.

 


Tech

Starting from its own report, Automation World details this week a phenomenon of interest for all owners and developers whose activities are linked to the Industrial Internet of Things (IIoT). In what the online publication deems a key fact, it seems that

Manufacturers are mapping out a route that covers both (cloud & edge), each an integral on-ramp for new predictive maintenance and performance monitoring applications at different stops on the journey.

 

Where is IIoT taking place, according to the AW report?

Not so much taking place as evolving, operating, and being deployed with all its features, data flows and streams of communication, of course. However one calls it, here is what the numbers say:

  • 43% of the respondents revealed they have edge computing implementations underway;
  • 51% of the respondents already use cloud computing in their IIoT implementations;
  • 20% of the respondents mentioned fog computing, “a superset of edge computing used to bridge the traditional operations (OT) world with enterprise IT”, according to AW;
  • 29% of the respondents marked their answer under “none of the above”.

While some of you may be momentarily surprised that the total exceeds 100%, this is because the survey respondents could pick more than one answer.

 

The user’s maturity degree in using edge and cloud computing in IIoT, according to the same study

The angle here is that the amount of time since starting to use either option, or both, entitles users to a firmer, more valid verdict when comparing the mediums they employ for IIoT.

  • 48% of the users have employed these technologies for more than 3 years;
  • 39% of the users have over 2 years of experience with edge/fog or cloud computing technology;
  • 12% of the respondents are still at the beginning of using these technologies.

 

Which environment is predominant?

When asked about their most preferred path for production/manufacturing data analysis applications, the survey respondents ranked their preferences as follows:

  • 52% cited edge computing;
  • under 20% mentioned equipment data analytics – for capacity or overall equipment effectiveness.

As concerns their “preferred paradigm for enterprise effectiveness analysis”,

  • 5% chose cloud computing;
  • 18% went for edge computing;
  • 27% mentioned fog computing.

 

The relationship between the types of IIoT deployment and the overall benefits

Since their experience with one or another of the deployment environments is around 2-3 years, the surveyed companies could provide an image of the overall benefits:

  • 50% of the responding companies vouched for significant reductions in downtime;
  • 38% mentioned “measurable improvements to production output”;
  • 37% witnessed a considerable profitability increase;
  • 30% mentioned a decrease in production costs.

A typical rollout pattern for IIoT deployments involves cloud, then edge, then cloud again (a mixed version).

For other relevant numbers, as well as perspectives on the related hardware field, you may access the source Automation World article here.

 

 

 

 


News

A couple of days ago, Microsoft launched a series of updates that benefit developers. At its Ignite event (Orlando, FL), the company revealed new and improved features across many of its product lines, in line with its AI & ML focus for 2018.

We browsed TechCrunch’s report of the notable developer-centric updates.

 

For the Microsoft Azure Machine Learning services

The selection, testing and tweaking processes become mostly automated, saving developers significant time. They are also able to build “without having to delve into the depths of TensorFlow, PyTorch or other AI frameworks.”

Also, more hardware-accelerated models for FPGAs will be available from now on. (FPGAs are powerful field-programmable gate arrays, used “for high-speed image classification and recognition scenarios on Azure”.)

Microsoft decided to add a Python SDK to the mix, too, making Azure Machine Learning more accessible from a wider variety of devices. According to Microsoft, this SDK “integrates the Azure Machine Learning service with Python development environments including Visual Studio Code, PyCharm, Azure Databricks notebooks and Jupyter notebooks”. It also lights up a number of different features, such as “deep learning, which enables developers to build and train models faster with massive clusters of graphical processing units, or GPUs” and link into the FPGAs mentioned above.

 

For the Microsoft Azure Cognitive Services


The Microsoft speech service for speech recognition and translation also features an upgrade. The improvements are in terms of voice quality, as well as availability.

Apparently, the voices generated by the Azure Cognitive Services’ deep-learning-based speech synthesis are now true to life.

 

For the Microsoft Bot Framework SDK

The company declared that it’s now much easier for any developer to build their first bot.

*However, TechCrunch’s comment reminds us that the bot hype is currently a thing of the past, so it remains to be seen how many developers will take advantage of the new features. Nevertheless, more natural human-computer interactions are now available via the Microsoft Bot Framework SDK.

 

The automated model selection and tuning of so-called hyperparameters that govern the performance of machine learning models that are part of automated machine learning will make AI development available to a broader set of Microsoft’s customers

– Eric Boyd, corporate vice president, AI Platform, Microsoft

Details here


Tech

LASTING Software’s Big Data experience spans from pharmaceutical industry projects to the business software and consultancy collaborations we are part of. Digital data is our field of action.

While auditing companies and taking part in the preliminary consultancy stage, one notices over time how the needs and requirements of each partner vary, even if the ultimate goals bring us all together. Staying ahead of the competition, getting a firm grasp of the industrial-revolution phenomenon, and keeping or making their organizations successful are goals common to companies, and even to entire industries.

Reaching these goals allows a wide range of software solutions to come into play. The best solution for you is the one that provides the most suitable, actionable answer to your current needs. Scalability is a big extra, because what you invest in now will also deliver tomorrow, next year, or in 5 years. Provided you specify this in the consultancy & design phase, a sustainable software solution should always be an integrated goal.

Along this line, we ran into an article that deconstructs Big Data needs, and we want to share a few of its ideas with you.

 

 

Operational data, a preliminary step to Big Data

While Big Data involves a great deal of external data purchasing, operational data is internal. The ReadWrite article that caught our attention underlines the importance of harnessing this type of data before making the leap to Big Data.

“Mining data for actionable information requires attentive management, accurate analysis, and continuous adjustment, and buying software and raw data doesn’t provide companies with the skills necessary to master these processes overnight”.

In other words, companies that are new to this need guidance, as do those that have concluded something needs to change for the better in the way they do things.

Both this and trying your hand at the data game by processing operational data could use the right partnerships.

The consultants in this field put their know-how and experience at your disposal. Once you successfully deploy operational data solutions, it may be time for Big Data, or not. Often the optimal solution combines both types of data. It all depends on the specifics of your activity.
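As a minimal sketch of what "combining both types of data" can mean in practice: internal operational records are joined with purchased external data on a shared key. All names and fields below are invented for illustration, not taken from any specific product.

```python
# Internal operational data: per-store weekly sales (hypothetical).
operational = {
    "store-01": {"units_sold": 1200},
    "store-02": {"units_sold": 450},
}

# Purchased external data: regional demographics per store (hypothetical).
external = {
    "store-01": {"region_population": 85_000},
    "store-02": {"region_population": 12_000},
}

# Merge the two sources per store; external fields enrich the internal record.
combined = {
    store: {**ops, **external.get(store, {})}
    for store, ops in operational.items()
}
```

The same join logic scales up to proper data pipelines; the point is that the internal operational side must already be clean and keyed before external data adds value.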

 

The smaller-scale solutions for digital data, between creativity and customization

Let’s say your organization has a certain operational need. The answer takes the shape of a potential project. Creativity points toward a certain type of solution, perhaps even a completely innovative one. As you go further into the matter, some aspects become limited by technical constraints. You customize down, so to speak. What you imagined at first glance can change once confronted with budgeting, resources, compatibility, project duration or other reasonable expectations.

When done in-house, this stage can be too harsh or, in some cases, not harsh enough. Both are dangerous, because instead of getting things done the right way, you would either settle for a more limited solution or delay solving the problem by chasing a less realistic projection. By partnering with a software solutions company, you can test these “could-be”s against a backdrop of hands-on experience, industry know-how and up-to-date knowledge in the software solutions field.

A second stage of customization can take place when you already have a skilled software partner. Your operational need may prove familiar across your industry or field, in which case the answer need not start from scratch. Building upon an already validated solution involves customizing up: the team creates and/or activates new features. Cost-saving and faster, this way of obtaining a personalized solution is perhaps the most frequent nowadays.

Contact LASTING Software, let’s talk about this and see where it takes us! We are here for your projects!

 


Tech

5G links to Smart Manufacturing, which links to Industry 4.0. In fact, the latter two may be seen as one and the same thing.


Smart manufacturing and the Smart Factory is a broad category of manufacturing with the goal of optimizing the manufacturing process. Smart manufacturing is the process that employs computer controls, modeling, big data and other automation to improve manufacturing efficiencies.

Source


Although some big players are still reticent, important 5G promoters continue to develop this technology. Moreover, hands-on demos illustrate its key qualities. Continuous process monitoring is one of the areas where 5G should bring shorter lead times while returning higher yields.

We took a look at one article occasioned by this year’s Industrial Manufacturing Show held in Chicago:


Industry 4.0 will rely on increasingly fast, secure, and often wireless, data transfers to optimize operations, increase automation and mitigate risk in manufacturing environments.

 

Potential 5G early adopters


Various sensitive industries that would benefit greatly from a reduction in production errors are potential early adopters.


The 5G article mentions the aeronautical industry and, within manufacturing, the production of sophisticated components.


The goal here is to have the collection and processing of sensor data, as well as the cutting-control action, take place in under one millisecond.


Research has pointed out that, for this type of demand, “5G and cellular wireless can deliver, with its ultra-low latency and high reliability”.
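The one-millisecond requirement can be pictured as a sense-process-act loop checked against a latency budget. The sketch below is illustrative only: the callbacks are stubs standing in for real sensor and actuator I/O, and the 1 ms figure is the budget cited in the article.

```python
import time

# 1 ms budget from the article: read sensors, process, and issue
# the cutting-control command within a single millisecond.
BUDGET_NS = 1_000_000  # 1 ms expressed in nanoseconds

def control_cycle(read_sensor, compute_command, actuate):
    """Run one sense-process-act cycle; report whether it met the budget."""
    start = time.perf_counter_ns()
    sample = read_sensor()
    command = compute_command(sample)
    actuate(command)
    elapsed_ns = time.perf_counter_ns() - start
    return command, elapsed_ns <= BUDGET_NS

# Stub callbacks standing in for real sensor/actuator hardware.
command, on_time = control_cycle(
    read_sensor=lambda: 42.0,
    compute_command=lambda sample: sample * 0.5,
    actuate=lambda cmd: None,
)
```

In a real deployment, the dominant term in `elapsed_ns` would be the network round trip, which is exactly where 5G's ultra-low latency is supposed to make the budget achievable.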

 

What does 5G mean for industrial production?


As mentioned above, no network solution before 5G has been able to provide instant monitoring and adjustment during production.


An extra benefit is that, by storing production and sensor data for each manufactured product, 5G-supported manufacturing enables real-time digital twins. Whenever needed, this data is available and ready to serve in research, analysis and risk assessment, or to answer other possible demands.
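A per-product digital twin can be as simple as a record that accumulates time-stamped sensor readings during production and can replay them later. The class and field names below are illustrative assumptions, not taken from any specific 5G platform.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Hypothetical per-product record accumulating production sensor data."""
    product_id: str
    sensor_readings: list = field(default_factory=list)

    def record(self, sensor: str, value: float, timestamp_ms: int) -> None:
        """Append one production-time sensor reading."""
        self.sensor_readings.append((sensor, value, timestamp_ms))

    def history(self, sensor: str) -> list:
        """Replay stored readings for one sensor, e.g. for risk assessment."""
        return [(v, t) for s, v, t in self.sensor_readings if s == sensor]

# Example: readings captured while machining one unit (invented values).
twin = DigitalTwin(product_id="unit-0042")
twin.record("spindle_temp_c", 61.3, timestamp_ms=120)
twin.record("spindle_temp_c", 63.8, timestamp_ms=240)
```

The "real-time" part is what 5G adds: with sub-millisecond transfers, such a record can stay synchronized with the physical product as it is being made, rather than being assembled after the fact.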

 

While waiting for 5G future developments


As our colleague Ionel Petrut, Project Manager at Lasting Software, mentioned in his presentation held this year for the SACI conference (IEEE 12th International Symposium on Applied Computational Intelligence and Informatics, SACI 2018, Timisoara, Romania):

 

The Internet of Things has had a very big impact on technology as we know it today. One important aspect, not yet standardized, is the way in which objects connect to the network to function together. In particular, in the medical field, the Internet of Things faces a high number of technical limitations, most of them related to electromagnetic emissions.

Until the 5G network creates the infrastructure for sensors to connect directly to the Internet, Bluetooth Low Energy remains the most suitable solution for Internet of Things connectivity.

 

We prove our software engineering skills and professional commitment by mastering the current technologies involved in the projects we are part of. But we also keep an eye on the latest developments, dreaming of the bold new world of fully deployed IoT capabilities.

 

Contact us for consultancy and partnerships – the future awaits!

 
