Aligned Technology
Consultants

We help our clients

Navigate

the technology landscape

Evaluate

the myriad options available

Choose

the ones that best suit them

Focus

on what they do best

Computers

A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users.

Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.

Mobile Devices

Device mobility can be viewed in the context of several qualities:[2]

Physical dimensions and weight
Whether the device itself is mobile, or some kind of host to which it is attached is mobile
What kinds of host devices it can be bound with
How devices communicate with a host
When mobility occurs

Strictly speaking, many so-called mobile devices are not mobile. It is the host that is mobile, i.e., a mobile human host carries a non-mobile smartphone device. An example of a true mobile computing device, where the device itself is mobile, is a robot. Another example is an autonomous vehicle.

There are three basic ways mobile devices can be physically bound to mobile hosts:

Accompanied
Surface-mounted
Embedded into the fabric of a host, e.g., an embedded controller in a host device

Accompanied refers to an object being loosely bound to and accompanying a mobile host; e.g., a smartphone can be carried in a bag or pocket but can easily be misplaced.[2] Mobile hosts with embedded devices, such as an autonomous vehicle, can be much larger than pocket-sized.

The most common size of a mobile computing device is pocket-sized, but other sizes for mobile devices exist. Mark Weiser, known as the father of ubiquitous computing,[3] referred to device sizes that are tab-sized, pad-sized, and board-sized,[4] where tabs are accompanied or wearable centimeter-sized devices (e.g., smartphones and phablets) and pads are hand-held decimeter-sized devices (e.g., tablets). If one changes the form of the mobile devices in terms of being non-planar, one can also have skin devices and tiny dust-sized devices.[2]

Dust refers to miniaturized devices without direct human-computer interfaces, e.g., micro-electromechanical systems (MEMS), ranging from nanometers through micrometers to millimeters; see also smart dust. Skin refers to fabrics based upon light-emitting and conductive polymers and organic computing devices, which can be formed into more flexible non-planar display surfaces and products such as clothes and curtains; see OLED display and smart device.

Although mobility is often regarded as synonymous with having wireless connectivity, these terms are different. Not all network access by mobile users, applications, and devices needs to be via wireless networks, and vice versa. Wireless access devices can be static, and mobile users can move between wired and wireless hotspots, such as in Internet cafés.[2] Some mobile devices can be used as mobile Internet devices to access the Internet while moving, but they do not need to do this, and many phone functions or applications remain operational even while disconnected from the Internet.

What makes the mobile device unique compared to other technologies is the inherent flexibility of its hardware and software. Flexible applications include video chat, web browsing, payment systems, near-field communication, audio recording, etc.[5] As mobile devices become ubiquitous, there will be an increase in services that make use of the cloud.[6] Although a common form of mobile device, the smartphone, has a display, another perhaps even more common form of smart computing device, the smart card (e.g., used as a bank card or travel card), does not. Such a device often has a CPU and memory but needs to connect to, or be inserted into, a reader to display its internal data or state.

Software

Building on previous innovations in mathematics and technology, software was created for the programmable digital computers that emerged in the late 1940s and was necessary to realize their usefulness. The first software was tied closely to the underlying computer hardware, but over time, the lower layers of the system have become more standardized, and software has become increasingly portable between different systems and abstracted from the underlying machine code. Operating systems manage the hardware resources and mediate between different applications that accomplish tasks for the user. Programming languages are the format in which software is written, and must be both human-readable and capable of being translated into unambiguous instructions for computer hardware. Compilers or interpreters are needed to link a program with other code that it relies on and convert the software into machine code that can be executed on the hardware. Programs are combined with each other and with external input to be capable of accomplishing a complex task.
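The translation step described above can be made concrete with a small sketch using Python's own built-in compile step. Note that Python translates source text into bytecode for its virtual machine rather than into native machine code, so this illustrates the pipeline rather than a compiler to hardware instructions.

```python
# Human-readable source code, as a string. The expression and the
# variable name 'result' are illustrative only.
source = "result = sum(n * n for n in range(5))"

# Translate the source into an executable code object (Python bytecode).
code_object = compile(source, filename="<example>", mode="exec")

# Execute the code object; 'namespace' plays the role of program state.
namespace = {}
exec(code_object, namespace)

print(namespace["result"])  # 0 + 1 + 4 + 9 + 16 = 30
```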

Software development's central task is programming and maintaining a project's source code, but the term also covers conceiving the project, evaluating its feasibility, analyzing the business requirements, software design, and release. Software quality assurance, including code review and testing, is an essential part of the process, since delivering quality code lowers the cost of reliability failures, of cyberattacks enabled by security vulnerabilities, and of maintenance. Maintenance typically consumes 75 percent or more of the software's lifetime engineering budget. Source code is protected by copyright law, which vests the owner with the exclusive right to copy the code. Software has become ubiquitous in everyday life in developed countries. In many cases, software augments the functionality of pre-existing technologies, but it has also enabled the creation of entirely new technologies such as the Internet, video games, social media, mobile phones, and GPS.

Internet

The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.
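As an illustrative sketch of two endpoints communicating over the TCP/IP suite mentioned above, the following uses Python's standard socket library to run a tiny echo exchange over a loopback connection; the message contents and port selection are arbitrary.

```python
import socket
import threading

# Server side: listen on the loopback interface; port 0 lets the OS
# pick any free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def serve_once():
    conn, _ = server.accept()
    data = conn.recv(1024)          # receive the request bytes
    conn.sendall(b"echo: " + data)  # reply over the same TCP connection
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

# Client side: connect, send a message, and read the reply.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply)  # b'echo: hello'
```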

The origins of the Internet date back to research that enabled the time-sharing of computer resources and the development of packet switching in the 1960s which led to the design of computer networks for data communication.[2][3] The set of rules (communication protocols) to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France.[4][5][6] The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, encouraged worldwide participation in the development of new networking technologies and the merger of many networks using DARPA's Internet protocol suite.[7] The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web,[8] marked the beginning of the transition to the modern Internet,[9] and generated sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network. Although the Internet was widely used by academia in the 1980s, the subsequent commercialization in the 1990s and beyond incorporated its services and technologies into virtually every aspect of modern life.

Most traditional communication media, including telephone, radio, television, paper mail, and newspapers, are reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephone, Internet television, online music, digital newspapers, and video streaming websites. Newspapers, books, and other print publishing have adapted to website technology or have been reshaped into blogging, web feeds, and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has grown exponentially for major retailers, small businesses, and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.

The Internet has no single centralized governance in either technological implementation or policies for access and usage; each constituent network sets its own policies.[10] The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.[11] In November 2006, the Internet was included on USA Today's list of the New Seven Wonders.[12]

Services

Software and hardware evaluation

Are you using the right software and hardware to serve your business needs?

Software configuration and deployment

Are you tweaking and optimizing?

Security and Privacy Hardening

Make sure your attack surface is as small as it can be

Web browsing best practices

There is a lot to be aware of about your web browser

Software and hardware evaluation

Performance testing is generally executed to determine how a system or sub-system performs in terms of responsiveness and stability under a particular workload. It can also serve to investigate, measure, validate or verify other quality attributes of the system, such as scalability, reliability and resource usage.

Load testing is primarily concerned with testing that the system can continue to operate under a specific load, whether that be large quantities of data or a large number of users. This is generally referred to as software scalability. When performed as a non-functional activity over an extended period, this load testing is often referred to as endurance testing. Volume testing is a way to test software functions even when certain components (for example, a file or database) increase radically in size. Stress testing is a way to test reliability under unexpected or rare workloads. Stability testing (often referred to as load or endurance testing) checks whether the software can continuously function well over an acceptable period.

There is little agreement on what the specific goals of performance testing are. The terms load testing, performance testing, scalability testing, and volume testing are often used interchangeably.
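A minimal load-testing sketch in the spirit of the terms above: simulate several concurrent "users" invoking an operation and summarize the response times. The operation function and the user counts here are illustrative placeholders for whatever request or system call is actually under test.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def operation():
    # Placeholder workload; in a real test this would be an HTTP
    # request, a database query, etc.
    time.sleep(0.01)

def timed_call(_):
    # Measure the latency of a single call.
    start = time.perf_counter()
    operation()
    return time.perf_counter() - start

users = 20            # concurrent simulated users
requests_per_user = 5

# Run all requests across a pool of worker threads, one per user.
with ThreadPoolExecutor(max_workers=users) as pool:
    latencies = list(pool.map(timed_call, range(users * requests_per_user)))

# Summarize responsiveness under this workload.
latencies.sort()
print(f"requests: {len(latencies)}")
print(f"median latency: {latencies[len(latencies) // 2] * 1000:.1f} ms")
print(f"p95 latency:    {latencies[int(len(latencies) * 0.95)] * 1000:.1f} ms")
```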

Real-time software systems have strict timing constraints. To test whether timing constraints are met, real-time testing is used.

Usability testing

Usability testing checks whether the user interface is easy to use and understand. It is concerned mainly with the use of the application. This is not a kind of testing that can be automated; actual human users are needed, monitored by skilled UI designers.

Accessibility testing

Accessibility testing is done to ensure that the software is accessible to persons with disabilities. Some of the common web accessibility tests are

Ensuring that the color contrast between the font and the background color is appropriate
Font size
Alternate text for multimedia content
Ability to use the system using the computer keyboard in addition to the mouse
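The color-contrast check listed above can be automated. The sketch below computes the contrast ratio defined by WCAG 2.x from gamma-corrected sRGB relative luminance; WCAG requires a ratio of at least 4.5:1 for normal body text.

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an (R, G, B) color, channels 0-255."""
    def channel(c):
        c = c / 255.0
        # Undo the sRGB gamma encoding.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1:1 (identical) up to 21:1 (black on white)."""
    lighter = max(relative_luminance(fg), relative_luminance(bg))
    darker = min(relative_luminance(fg), relative_luminance(bg))
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```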

Common Standards for compliance

Americans with Disabilities Act of 1990
Section 508 Amendment to the Rehabilitation Act of 1973
Web Accessibility Initiative (WAI) of the World Wide Web Consortium (W3C)

Security testing

Security testing is essential for software that processes confidential data to prevent system intrusion by hackers.

The International Organization for Standardization (ISO) defines this as a "type of testing conducted to evaluate the degree to which a test item, and associated data and information, are protected so that unauthorised persons or systems cannot use, read or modify them, and authorized persons or systems are not denied access to them."
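The ISO definition above translates naturally into automated checks. The sketch below, with a hypothetical DataStore class standing in for the system under test, asserts both halves of the definition: unauthorized principals cannot read the protected data, and authorized ones are not denied access.

```python
class DataStore:
    """Hypothetical system under test that guards a confidential record."""

    def __init__(self, authorized):
        self._authorized = set(authorized)
        self._secret = "confidential record"

    def read(self, user):
        if user not in self._authorized:
            raise PermissionError(f"{user} is not authorized")
        return self._secret

store = DataStore(authorized={"alice"})

# Authorized persons or systems are not denied access.
assert store.read("alice") == "confidential record"

# Unauthorized persons or systems cannot use or read the data.
try:
    store.read("mallory")
    denied = False
except PermissionError:
    denied = True
assert denied

print("security checks passed")
```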

Software configuration and deployment

Some operating system designs are more secure than others. Those with no isolation between the kernel and applications are least secure, while those with a monolithic kernel like most general-purpose operating systems are still vulnerable if any part of the kernel is compromised. A more secure design features microkernels that separate the kernel's privileges into many separate security domains and reduce the consequences of a single kernel breach.[116] Unikernels are another approach that improves security by minimizing the kernel and separating out other operating system functionality by application.[116]

Most operating systems are written in C or C++, which creates potential vulnerabilities for exploitation. Despite attempts to protect against them, many vulnerabilities stem from buffer overflow attacks, which are enabled by the lack of bounds checking in these languages.[117] Hardware vulnerabilities, some of them caused by CPU optimizations, can also be used to compromise the operating system.[118] There are known instances of operating system programmers deliberately implanting vulnerabilities, such as back doors.[119]
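The bounds checking mentioned above can be illustrated with a short sketch: a memory-safe runtime rejects an out-of-range write that unchecked C code would silently perform on adjacent memory, which is the root of classic buffer overflows. The safe_write helper is illustrative only, not a real API.

```python
buffer = [0] * 8  # fixed-size buffer of 8 slots

def safe_write(buf, index, value):
    # Explicit bounds check: reject writes outside the buffer instead of
    # letting them corrupt whatever memory lies next to it.
    if not 0 <= index < len(buf):
        raise IndexError(f"index {index} outside buffer of size {len(buf)}")
    buf[index] = value

safe_write(buffer, 3, 42)      # in range: succeeds
try:
    safe_write(buffer, 9, 7)   # out of range: rejected, not overflowed
except IndexError as exc:
    print("blocked:", exc)
```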

Operating systems security is hampered by their increasing complexity and the resulting inevitability of bugs.[120] Because formal verification of operating systems may not be feasible, developers use operating system hardening to reduce vulnerabilities,[121] e.g. address space layout randomization, control-flow integrity,[122] access restrictions,[123] and other techniques.[124] There are no restrictions on who can contribute code to open source operating systems; such operating systems have transparent change histories and distributed governance structures.[125] Open source developers strive to work collaboratively to find and eliminate security vulnerabilities, using code review and type checking to expunge malicious code.[126][127] Andrew S. Tanenbaum advises releasing the source code of all operating systems, arguing that it prevents developers from placing trust in secrecy and thus relying on the unreliable practice of security by obscurity.[128]

Security and Privacy Hardening

Security by design

Security by design, or alternatively secure by design, means that the software has been designed from the ground up to be secure. In this case, security is considered a main feature.

Security architecture

Security architecture can be defined as the "practice of designing computer systems to achieve security goals."[51] These goals overlap with the principles of "security by design" explored above, including to "make initial compromise of the system difficult" and to "limit the impact of any compromise."[51] In practice, the role of a security architect is to ensure that the structure of a system reinforces its security, and that new changes are safe and meet the security requirements of the organization.[52][53]

Security measures

A state of computer security is the conceptual ideal, attained by the use of three processes: threat prevention, detection, and response. These processes are based on various policies and system components, which include the following:

Limiting the access of individuals using user account access controls, and using cryptography, can protect system files and data, respectively.

Firewalls are by far the most common prevention systems from a network security perspective, as they can (if properly configured) shield access to internal network services and block certain kinds of attacks through packet filtering. Firewalls can be both hardware- and software-based. They monitor and control incoming and outgoing traffic of a computer network and establish a barrier between a trusted network and an untrusted network.[55]

Intrusion detection system (IDS) products are designed to detect network attacks in progress and assist in post-attack forensics, while audit trails and logs serve a similar function for individual systems.

Response is necessarily defined by the assessed security requirements of an individual system and may range from a simple upgrade of protections to notification of legal authorities, counter-attacks, and the like. In some special cases, complete destruction of the compromised system is favored, as it may happen that not all compromised resources are detected.

Cyber security awareness training helps people cope with cyber threats and attacks.[56]

Forward web proxy solutions can prevent clients from visiting malicious web pages and can inspect content before it is downloaded to client machines.
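The packet filtering a firewall performs, described above, can be sketched as a toy rule table: the first rule matching a packet's destination port decides the action, with default-deny for everything unlisted. The rule set and the filter_packet function are illustrative only; real firewalls match on many more fields (addresses, protocol, connection state).

```python
# Ordered rule table: first match on destination port wins.
RULES = [
    {"port": 22, "action": "deny"},    # block SSH from the untrusted side
    {"port": 80, "action": "allow"},   # allow HTTP
    {"port": 443, "action": "allow"},  # allow HTTPS
]
DEFAULT_ACTION = "deny"                # default-deny: block anything unlisted

def filter_packet(dest_port):
    """Return the action for a packet addressed to dest_port."""
    for rule in RULES:
        if rule["port"] == dest_port:
            return rule["action"]
    return DEFAULT_ACTION

print(filter_packet(443))   # allow
print(filter_packet(22))    # deny
print(filter_packet(8080))  # deny (no rule matches, so default-deny applies)
```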

Web browsing best practices

Web navigation refers to the process of navigating a network of information resources in the World Wide Web, which is organized as hypertext or hypermedia.[1] The user interface that is used to do so is called a web browser.[2]

A central theme in web design is the development of a web navigation interface that maximizes usability.

A website's overall navigational scheme includes several navigational pieces such as global, local, supplemental, and contextual navigation; all of these are vital aspects of the broad topic of web navigation.[3] The hierarchical navigation system is vital as well, since it is the primary navigation system. It allows the user to navigate within the site using levels alone, which is often seen as restricting and requires additional navigation systems to better structure the website.[4] The global navigation of a website, as another segment of web navigation, serves as the outline and template for easy maneuvering by users accessing the site, while local navigation is often used to help users within a specific section of the site.[3] All these navigational pieces fall under the categories of various types of web navigation, allowing for further development and for more efficient experiences upon visiting a webpage.

Contact Us

For more information send us a message