Web development is the work involved in developing a website for the Internet (World Wide Web) or an intranet (a private network). Web development can range from developing a simple single static page of plain text to complex web applications, electronic businesses, and social network services. A more comprehensive list of tasks to which web development commonly refers may include web engineering, web design, web content development, client liaison, client-side/server-side scripting, web server and network security configuration, and e-commerce development. Among web professionals, "web development" usually refers to the main non-design aspects of building websites: writing markup and coding. Web development may use content management systems (CMS) so that content changes can be made with only basic technical skills. For larger organizations and businesses, web development teams can consist of hundreds of people (web developers) and follow standard methods such as Agile methodologies while developing websites. Smaller organizations may only require a single permanent or contracting developer, or a secondary assignment to related job positions such as graphic designer or information systems technician.
Web development may be a collaborative effort between departments rather than the domain of a designated department. There are three kinds of web developer specialization: front-end developer, back-end developer, and full-stack developer. Front-end developers are responsible for the behavior and visuals that run in the user's browser, while back-end developers deal with the servers. Since the commercialization of the web, the industry has boomed, and the web has become one of the most widely used technologies ever.

Tim Berners-Lee created the World Wide Web in 1989 at CERN. The primary goal in the development of the web was to fulfill the automated information-sharing needs of academics affiliated with institutions and various global organizations. Web 1.0 is described as the first paradigm, in which users could only view material and provide a small amount of information. The core protocols of Web 1.0 were HTTP, HTML, and URI. Web 2.0, a term popularized by Dale Dougherty, then vice president of O'Reilly, during a 2004 conference with Media Live, marks a shift in internet usage, emphasizing interactivity.
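As a rough illustration of that front-end/back-end split, and of the Web 1.0 trio of HTTP, HTML, and URI, the following is a minimal sketch of a back-end written in TypeScript for the Node.js runtime; the runtime, the port, and the markup are assumptions made purely for this example, not part of any particular project. Everything inside the request handler is server-side work, while the HTML it returns is rendered by the user's browser, which is where front-end code runs.

```ts
// Minimal sketch, assuming a Node.js runtime: a back-end HTTP server that
// answers a URI with an HTML document, which the browser (front-end) renders.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  // The URI identifies the resource; HTTP carries the request and response;
  // HTML is the markup the browser turns into a page.
  if (req.url === "/") {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end("<!doctype html><html><body><h1>Hello, Web</h1></body></html>");
  } else {
    res.writeHead(404, { "Content-Type": "text/plain" });
    res.end("Not found");
  }
});

server.listen(8080); // port 8080 is an arbitrary choice for this example
```

A front-end developer's work would begin inside the returned page, typically with CSS and browser-side scripts.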
Web 2.0 introduced increased user engagement and communication. It evolved from the static, read-only nature of Web 1.0 into an integrated network for engagement and communication, and it is often described as a user-focused, read-write online network. In Web 2.0 environments, users have access to platforms that encourage sharing activities such as creating music, files, images, and movies. The architecture of Web 2.0 is often considered the "backbone of the internet," using standardized XML (Extensible Markup Language) tags to enable the flow of information between independent platforms and online databases.
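A familiar instance of that XML-based flow is content syndication, in which a site publishes its updates as a standardized feed that independent platforms can consume. The sketch below, written in TypeScript for a browser environment, fetches an RSS-style feed and lists its item titles; the feed URL and the RSS element names are assumptions chosen purely for illustration, and any standards-compliant feed would do.

```ts
// Minimal sketch, assuming a browser environment and an RSS-style XML feed.
// The URL is a placeholder; real requests are subject to the usual
// cross-origin rules.
async function listFeedTitles(feedUrl: string): Promise<string[]> {
  const response = await fetch(feedUrl);
  const xmlText = await response.text();

  // DOMParser turns the raw XML into a document we can query by tag name.
  const doc = new DOMParser().parseFromString(xmlText, "application/xml");

  // In RSS, each <item> carries one entry; its <title> holds the headline.
  return Array.from(doc.querySelectorAll("item > title")).map(
    (node) => node.textContent ?? ""
  );
}

// Example usage with a placeholder URL:
listFeedTitles("https://example.com/feed.xml").then((titles) =>
  titles.forEach((title) => console.log(title))
);
```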
Web 3.0, considered the third and current version of the web, was introduced in 2014 and is sometimes also known as the Semantic Web. The concept envisions a complete redesign of the web. Key features include the integration of metadata, precise information delivery, and improved user experiences based on preferences, history, and interests. Web 3.0 aims to turn the web into a sizable, organized database, providing more functionality than traditional search engines. Users can customize navigation based on their preferences, and the core ideas involve identifying data sources, connecting them for efficiency, and creating user profiles.

The journey of web development technologies began with simple HTML pages in the early days of the internet. Over time, advances added CSS for styling and JavaScript for interactivity, transforming static websites into the dynamic, responsive platforms and feature-rich web applications we have today. Future web development will be driven by advances in browser technology, internet infrastructure, protocol standards, software engineering methods, and application trends.
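To make that layering concrete, here is a minimal browser-side TypeScript sketch; the element, its text, and the styling choices are illustrative assumptions rather than a prescribed approach. The created heading plays the role of the HTML structure, the style properties play the role of CSS, and the click handler provides the JavaScript interactivity.

```ts
// Minimal browser-side sketch of the three layers: structure, style, behavior.
// The element, its text, and the styling are illustrative assumptions.
const heading = document.createElement("h1");
heading.textContent = "Click count: 0";   // structure: an element in the page

heading.style.color = "steelblue";        // styling: CSS properties set from code
heading.style.fontFamily = "sans-serif";

let clicks = 0;
heading.addEventListener("click", () => {
  // interactivity: the page responds to the user without a full reload
  clicks += 1;
  heading.textContent = `Click count: ${clicks}`;
});

document.body.appendChild(heading);
```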
The web development life cycle is a method that outlines the stages involved in building websites and web applications. It provides a structured approach, ensuring optimal results throughout the development process. Debra Howcraft and John Carroll proposed a methodology that divides the web development process into sequential phases, each involving several kinds of analysis. Phase one involves crafting a web strategy and analyzing how a website can effectively achieve its goals. To reduce the risk of a web presence that fails to serve those goals, phase one establishes strategic goals and objectives and designs a system to fulfill them. The decision to establish a web presence should ideally align with the organization's corporate information strategy. During this phase, the previously outlined objectives and available resources undergo analysis to determine their feasibility:

Technology analysis: identification of all necessary technological components and tools for constructing, hosting, and supporting the site.
Information analysis: identification of the information users require, whether static (a web page) or dynamic (pulled "live" from a database server).
Skills analysis: identification of the diverse skill sets necessary to complete the project.
User analysis: identification of all intended users of the site, a more intricate process due to the varied range of users and the technologies they may use.
Cost analysis: estimation of the development cost for the site, or an evaluation of what is achievable within a predefined budget.
Risk analysis: examination of any major risks associated with site development.