An Internet filter is software that restricts or controls the content an Internet user is able to access, especially when used to restrict material delivered over the Internet via the Web, email, or other means. Such restrictions can be applied at various levels: a government can attempt to apply them nationwide (see Internet censorship), or they can, for example, be applied by an Internet service provider to its clients, by an employer to its personnel, by a school to its students, by a library to its visitors, by a parent to a child's computer, or by an individual user to their own computer. The motive is often to prevent access to content which the computer's owner(s) or other authorities may consider objectionable. When imposed without the consent of the user, content control can be characterised as a form of internet censorship. Some filter software includes time-control functions that let parents set the amount of time a child may spend accessing the Internet, playing games, or performing other computer activities.

Terminology


The term "content control" is used on occasion by CNN,[1] Playboy magazine,[2] the San Francisco Chronicle,[3] and The New York Times.[4] However, several other terms, including "content filtering software", "web content filter", "filtering proxy servers", "secure web gateways", "censorware", "content security and control", "web filtering software", "content-censoring software", and "content-blocking software", are often used. "Nannyware" has also been used in both product marketing and by the media. Industry research company Gartner uses "secure web gateway" (SWG) to describe the market segment.[5]

Companies that make products that selectively block Web sites do not refer to these products as censorware, and prefer terms such as "Internet filter" or "URL filter"; in the specialized case of software designed specifically to allow parents to monitor and restrict the access of their children, the term "parental control software" is also used. Some products log all sites that a user accesses and rate them based on content type for reporting to an "accountability partner" of the person's choosing; for these the term accountability software is used. Internet filters, parental control software, and/or accountability software may also be combined into one product.

Those critical of such software, however, use the term "censorware" freely: consider the Censorware Project, for example.[6] The use of the term "censorware" in editorials criticizing makers of such software is widespread and covers many different varieties and applications: Xeni Jardin used the term in a 9 March 2006 editorial in The New York Times, when discussing the use of American-made filtering software to suppress content in China; in the same month a high school student used the term to discuss the deployment of such software in his school district.[7][8]

In general, outside of editorial pages as described above, traditional newspapers do not use the term "censorware" in their reporting, preferring instead to use less overtly controversial terms such as "content filter", "content control", or "web filtering"; The New York Times and The Wall Street Journal both appear to follow this practice. On the other hand, Web-based newspapers such as CNET use the term in both editorial and journalistic contexts, for example "Windows Live to Get Censorware."[9]

Types of filtering


Filters can be implemented in many different ways: by software on a personal computer, via network infrastructure such as proxy servers, DNS servers, or firewalls that provide Internet access. No solution provides complete coverage, so most companies deploy a mix of technologies to achieve the proper content control in line with their policies.

Browser based filters

A browser-based content filtering solution is the most lightweight way to filter content, and is typically implemented as a third-party browser extension.

E-mail filters

E-mail filters act on information contained in the mail body, in the mail headers such as sender and subject, and e-mail attachments to classify, accept, or reject messages. Bayesian filters, a type of statistical filter, are commonly used. Both client and server based filters are available.
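
As an illustration of the statistical approach mentioned above, the following minimal sketch scores a message by naive Bayes log-odds. It assumes a tiny, hand-labelled toy corpus and is not any particular product's implementation.

```python
import math
from collections import Counter

# Tiny hand-labelled corpus standing in for real training mail (illustrative only).
SPAM = ["win money now", "cheap pills online", "win a prize now"]
HAM = ["meeting agenda attached", "lunch tomorrow at noon", "project status update"]

def word_counts(messages):
    counts = Counter()
    for message in messages:
        counts.update(message.lower().split())
    return counts

spam_counts, ham_counts = word_counts(SPAM), word_counts(HAM)
spam_total, ham_total = sum(spam_counts.values()), sum(ham_counts.values())
vocab_size = len(set(spam_counts) | set(ham_counts))

def spam_score(message: str) -> float:
    """Log-odds that the message is spam, using add-one (Laplace) smoothing."""
    score = math.log(len(SPAM) / len(HAM))  # class prior
    for word in message.lower().split():
        p_word_spam = (spam_counts[word] + 1) / (spam_total + vocab_size)
        p_word_ham = (ham_counts[word] + 1) / (ham_total + vocab_size)
        score += math.log(p_word_spam / p_word_ham)
    return score

if __name__ == "__main__":
    print(spam_score("win cheap pills now"))       # positive: scored as spam
    print(spam_score("project meeting tomorrow"))  # negative: scored as legitimate
```

A real e-mail filter would train on a much larger corpus and would also weigh headers and attachments, as described above, but the scoring step follows the same pattern.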

Client-side filters

This type of filter is installed as software on each computer where filtering is required.[10][11] It can typically be managed, disabled, or uninstalled by anyone who has administrator-level privileges on the system. A DNS-based client-side filter can be set up as a DNS sinkhole, such as Pi-hole.
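
As a minimal sketch of the sinkhole idea at the client level, the script below (hypothetical domain names; the paths are the usual hosts-file locations) maps blocked domains to an unroutable address by appending entries to the local hosts file. It also illustrates why such a filter can be removed by anyone with administrator privileges: writing the hosts file requires exactly those privileges.

```python
import platform
from pathlib import Path

# Hypothetical domains to sinkhole on this machine.
BLOCKLIST = ["adult.example", "tracker.example"]
SINKHOLE = "0.0.0.0"  # unroutable address: lookups resolve, connections go nowhere

def hosts_path() -> Path:
    if platform.system() == "Windows":
        return Path(r"C:\Windows\System32\drivers\etc\hosts")
    return Path("/etc/hosts")

def append_sinkhole_entries() -> None:
    # Writing the hosts file needs administrator/root rights, the same rights
    # that let a user disable or remove a client-side filter.
    entries = [f"{SINKHOLE} {domain}\n" for domain in BLOCKLIST]
    with hosts_path().open("a", encoding="utf-8") as hosts_file:
        hosts_file.writelines(entries)

if __name__ == "__main__":
    append_sinkhole_entries()
```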

Content-limited (or filtered) ISPs

Content-limited (or filtered) ISPs are Internet service providers that offer access to only a set portion of Internet content on an opt-in or a mandatory basis. Anyone who subscribes to this type of service is subject to restrictions. These types of filters can be used to implement government,[12] regulatory,[13] or parental control over subscribers.

Network-based filtering

This type of filter is implemented at the transport layer as a transparent proxy, or at the application layer as a web proxy.[14] Filtering software may include data loss prevention functionality to filter outbound as well as inbound information. All users are subject to the access policy defined by the institution. The filtering can be customized, so a school district's high school library can have a different filtering profile than the district's junior high school library.
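
A gateway deployment typically pairs a URL-categorisation lookup with per-group policies. The sketch below (hypothetical subnets, profile names, and categories) shows how a district gateway might select a different blocking profile for each library's address range; it illustrates only the policy lookup, not any vendor's product.

```python
import ipaddress

# Hypothetical profiles: the junior-high library gets a stricter category set
# than the high-school library on the same district network.
PROFILES = {
    "high_school": {"adult", "gambling"},
    "junior_high": {"adult", "gambling", "games", "social_media"},
}
SUBNET_TO_PROFILE = {
    ipaddress.ip_network("10.1.0.0/24"): "high_school",
    ipaddress.ip_network("10.2.0.0/24"): "junior_high",
}

def blocked_categories(client_ip: str) -> set:
    """Pick the filtering profile that applies to the requesting client."""
    address = ipaddress.ip_address(client_ip)
    for subnet, profile in SUBNET_TO_PROFILE.items():
        if address in subnet:
            return PROFILES[profile]
    return set()  # unknown clients: no category blocking in this sketch

def allow(client_ip: str, url_category: str) -> bool:
    # The gateway would call this per request, after a URL-categorisation lookup.
    return url_category not in blocked_categories(client_ip)

print(allow("10.2.0.15", "games"))  # False: junior-high profile blocks games
print(allow("10.1.0.15", "games"))  # True: high-school profile allows them
```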

DNS-based filtering

This type of filtering is implemented at the DNS layer and attempts to prevent lookups for domains that do not fit within a set of policies (either parental controls or company rules). Multiple free public DNS services offer filtering options as part of their service. DNS sinkholes such as Pi-hole can also be used for this purpose, though only client-side.[15]
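
A minimal sketch of the policy decision made by a filtering resolver is shown below (hypothetical blocklist; the real DNS wire protocol and forwarding are omitted): blocked domains and their subdomains are answered with a sinkhole address, and everything else is resolved normally.

```python
import socket

# Hypothetical policy: lookups for these domains (and their subdomains) are refused.
BLOCKLIST = {"adult.example", "gambling.example"}
SINKHOLE_IP = "0.0.0.0"  # address handed out instead of the real one

def is_blocked(domain: str) -> bool:
    labels = domain.lower().rstrip(".").split(".")
    # Match "adult.example" itself as well as "www.adult.example".
    return any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels)))

def resolve(domain: str) -> str:
    if is_blocked(domain):
        return SINKHOLE_IP               # policy decision: withhold the real address
    return socket.gethostbyname(domain)  # otherwise resolve as usual

print(resolve("www.adult.example"))  # sinkholed by policy
print(resolve("example.com"))        # the real address, looked up normally
```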

Search-engine filters

Many search engines, such as Google and Bing, offer users the option of turning on a safety filter. When this safety filter is activated, it filters out inappropriate links from all of the search results. If users know the actual URL of a website that features explicit or adult content, they can access that content without using a search engine. Some providers offer child-oriented versions of their engines that permit only child-friendly websites.[16]
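
Many engines also expose the safety filter as a URL parameter, so a filter or parental-control tool can rewrite outgoing search requests to force it on. The sketch below assumes Google's safe=active SafeSearch parameter; other engines use their own parameter names.

```python
from urllib.parse import urlencode

def search_url(query: str, force_safe: bool = True) -> str:
    # "safe=active" is assumed here to be Google's SafeSearch query parameter;
    # other engines expose the same idea under different parameter names.
    params = {"q": query}
    if force_safe:
        params["safe"] = "active"
    return "https://www.google.com/search?" + urlencode(params)

print(search_url("human anatomy homework"))
# https://www.google.com/search?q=human+anatomy+homework&safe=active
```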

Parental controls

Some ISPs offer parental control options. Some offer security software which includes parental controls. Mac OS X v10.4 offers parental controls for several applications (Mail, Finder, iChat, Safari & Dictionary). Microsoft's Windows Vista operating system also includes content-control software.

Reasons for filtering


The Internet does not intrinsically provide content blocking, and it therefore carries much content that is considered unsuitable for children, including content certified as suitable for adults only, e.g. 18-rated games and movies.

Internet service providers (ISPs) that block material containing pornography or controversial religious, political, or news-related content en route are often used by parents who do not permit their children to access content that does not conform to their personal beliefs. Content filtering software can, however, also be used to block malware and other content that is or contains hostile, intrusive, or annoying material, including adware, spam, computer viruses, worms, trojan horses, and spyware.

Most content control software is marketed to organizations or parents. It is, however, also marketed on occasion to facilitate self-censorship, for example by people struggling with addictions to online pornography, gambling, chat rooms, etc. Self-censorship software may also be utilised by some in order to avoid viewing content they consider immoral, inappropriate, or simply distracting. A number of accountability software products are marketed as self-censorship or accountability software. These are often promoted by religious media and at religious gatherings.[17]

Technology


Content filtering technology exists in two major forms: application gateways and packet inspection. For HTTP access the application gateway is called a web proxy, or simply a proxy. Such web proxies can inspect both the initial request and the returned web page using arbitrarily complex rules, and will not return any part of the page to the requester until a decision is made. In addition, they can make substitutions for the whole or any part of the returned result. Packet inspection filters do not initially interfere with the connection to the server, but instead inspect the data in the connection as it passes; at some point the filter may decide that the connection is to be filtered, and it will then disconnect it by injecting a TCP reset or a similar forged packet. The two techniques can be used together: the packet filter monitors a link until it sees an HTTP connection starting to an IP address whose content needs filtering, then redirects the connection to the web proxy, which can perform detailed filtering on the website without having to pass through all unfiltered connections. This combination is quite popular because it can significantly reduce the cost of the system.
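
The reset-injection behaviour described above can be sketched with the scapy packet library. The snippet below is illustrative only: it requires root privileges, watches plaintext HTTP on port 80, and uses a hypothetical blocklist. It sniffs passing traffic and, when a request names a blocked host, forges a TCP reset toward the client to tear the connection down.

```python
from scapy.all import IP, TCP, Raw, send, sniff

BLOCKED_HOSTS = {b"blocked.example"}  # hypothetical blocklist

def maybe_reset(pkt):
    # Only consider TCP segments that carry payload (e.g. an HTTP request).
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        return
    payload = pkt[Raw].load
    # Crude check: does the plaintext HTTP Host header name a blocked site?
    if not any(b"Host: " + host in payload for host in BLOCKED_HOSTS):
        return
    ip, tcp = pkt[IP], pkt[TCP]
    # Forge a reset that appears to come from the server, aimed at the client.
    rst = IP(src=ip.dst, dst=ip.src) / TCP(
        sport=tcp.dport, dport=tcp.sport, flags="R", seq=tcp.ack)
    send(rst, verbose=False)

# Requires root privileges; only sees plaintext HTTP on port 80.
sniff(filter="tcp port 80", prn=maybe_reset, store=False)
```

A production packet-inspection filter works at much higher speed and in the network path itself, but the decision-then-reset pattern is the same.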

There are constraints to IP-level packet filtering, as it may render all web content associated with a particular IP address inaccessible. This can result in the unintentional blocking of legitimate sites that share the same IP address or domain; for instance, university websites commonly host multiple domains under one IP address. Conversely, IP-level packet filtering can be circumvented by serving certain content from a distinct IP address while keeping it linked to the same domain or server.[18]

Gateway-based content control software may be more difficult to bypass than desktop software as the user does not have physical access to the filtering device. However, many of the techniques in the Bypassing filters section still work.

Content labeling


Content labeling may be considered another form of content-control software. In 1994, the Internet Content Rating Association (ICRA), now part of the Family Online Safety Institute, developed a content rating system for online content providers. Using an online questionnaire, a webmaster describes the nature of their web content. A small file is generated that contains a condensed, computer-readable digest of this description, which can then be used by content filtering software to block or allow that site.

ICRA labels come in a variety of formats.[19] These include the World Wide Web Consortium's Resource Description Framework (RDF) as well as Platform for Internet Content Selection (PICS) labels used by Microsoft's Internet Explorer Content Advisor.[20]

ICRA labels are an example of self-labeling. Similarly, in 2006 the Association of Sites Advocating Child Protection (ASACP) initiated the Restricted to Adults (RTA) self-labeling initiative. ASACP members were concerned that various forms of legislation being proposed in the United States would have the effect of forcing adult companies to label their content.[21] The RTA label, unlike ICRA labels, does not require a webmaster to fill out a questionnaire or sign up for use. Like the ICRA label, the RTA label is free. Both labels are recognized by a wide variety of content-control software.

The Voluntary Content Rating (VCR) system was devised by Solid Oak Software for their CYBERsitter filtering software, as an alternative to the PICS system, which some critics deemed too complex. It employs HTML metadata tags embedded within web page documents to specify the type of content contained in the document. Only two levels are specified, mature and adult, making the specification extremely simple.
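
A filter that honours such self-labels only needs to scan a page's metadata. The sketch below uses a placeholder meta-tag name (the exact attribute name used by the VCR scheme is not reproduced here) and classifies a page as restricted when it declares one of the two defined levels, mature or adult.

```python
from html.parser import HTMLParser

# Placeholder attribute name for illustration; the exact meta-tag name used by
# the VCR scheme is not reproduced here.
RATING_META_NAME = "voluntary-content-rating"

class RatingScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rating = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == RATING_META_NAME:
            self.rating = (attrs.get("content") or "").lower()

def classify(page_html: str) -> str:
    scanner = RatingScanner()
    scanner.feed(page_html)
    # The scheme defines only two levels: "mature" and "adult".
    if scanner.rating in ("mature", "adult"):
        return f"restricted ({scanner.rating})"
    return "unlabelled"

print(classify('<meta name="voluntary-content-rating" content="adult">'))  # restricted (adult)
print(classify("<p>No rating label on this page.</p>"))                    # unlabelled
```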

By country


Australia


The Australian Internet Safety Advisory Body provides "practical advice on Internet safety, parental control and filters for the protection of children, students and families", including for public libraries.[22]

NetAlert, the software made available free of charge by the Australian government, was allegedly cracked by a 16-year-old student, Tom Wood, less than a week after its release in August 2007. Wood supposedly bypassed the $84 million filter in about half an hour to highlight problems with the government's approach to Internet content filtering.[23]

The Australian Government has introduced legislation that requires ISPs to "restrict access to age restricted content (commercial MA15+ content and R18+ content) either hosted in Australia or provided from Australia" that was due to commence from 20 January 2008, known as Cleanfeed.[24]

Cleanfeed is a proposed mandatory ISP-level content filtration system. It was proposed by the Beazley-led Australian Labor Party opposition in a 2006 press release, with the intention of protecting children who were vulnerable due to claimed parental computer illiteracy. It was announced on 31 December 2007 as a policy to be implemented by the Rudd ALP government, and initial tests in Tasmania produced a 2008 report. Cleanfeed is funded in the current budget and is moving towards an expression of interest for live testing with ISPs in 2008. Public opposition and criticism have emerged, led by the EFA and gaining irregular mainstream media attention, with a majority of Australians reportedly "strongly against" its implementation.[25] Criticisms include its expense, its inaccuracy (it will be impossible to ensure that only illegal sites are blocked), and the fact that it will be compulsory, which can be seen as an intrusion on free speech rights.[25] Another major criticism has been that although the filter is claimed to stop certain materials, the underground rings dealing in such materials will not be affected. The filter might also provide a false sense of security for parents, who might supervise children less while they use the Internet, achieving the exact opposite effect.[original research?] Cleanfeed is the responsibility of Senator Conroy's portfolio.

Denmark


In Denmark, the stated policy is to "prevent inappropriate Internet sites from being accessed from children's libraries across Denmark".[26] "'It is important that every library in the country has the opportunity to protect children against pornographic material when they are using library computers. It is a main priority for me as Culture Minister to make sure children can surf the net safely at libraries,' states Brian Mikkelsen in a press release of the Danish Ministry of Culture."[27]

United Kingdom


Many libraries in the UK, such as the British Library[28] and local authority public libraries,[29] apply filters to Internet access. According to research conducted by the Radical Librarians Collective, at least 98% of public libraries apply filters, including blocked categories such as "LGBT interest", "abortion", and "questionable".[30] Some public libraries also block payday loan websites.[31]

United States


The use of Internet filters or content-control software varies widely in public libraries in the United States, since Internet use policies are established by the local library board. Many libraries adopted Internet filters after Congress conditioned the receipt of universal service discounts on the use of Internet filters through the Children's Internet Protection Act (CIPA). Other libraries do not install content control software, believing that acceptable use policies and educational efforts address the issue of children accessing age-inappropriate content while preserving adult users' right to freely access information. Some libraries use Internet filters on computers used by children only. Some libraries that employ content-control software allow the software to be deactivated on a case-by-case basis on application to a librarian; libraries that are subject to CIPA are required to have a policy that allows adults to request that the filter be disabled without having to explain the reason for their request.

Many legal scholars believe that a number of legal cases, in particular Reno v. American Civil Liberties Union, established that the use of content-control software in libraries is a violation of the First Amendment.[32] However, in the June 2003 case United States v. American Library Association, the Supreme Court found the Children's Internet Protection Act (CIPA) constitutional as a condition placed on the receipt of federal funding, stating that First Amendment concerns were dispelled by the law's provision allowing adult library users to have the filtering software disabled without having to explain the reasons for their request. The plurality decision, however, left open a future "as-applied" constitutional challenge.

In November 2006, a lawsuit was filed against the North Central Regional Library District (NCRL) in Washington State for its policy of refusing to disable restrictions upon requests of adult patrons, but CIPA was not challenged in that matter.[33] In May 2010, the Washington State Supreme Court issued an opinion after being asked to answer a question certified to it by the United States District Court for the Eastern District of Washington: "Whether a public library, consistent with Article I, § 5 of the Washington Constitution, may filter Internet access for all patrons without disabling Web sites containing constitutionally-protected speech upon the request of an adult library patron." The Washington State Supreme Court ruled that NCRL's internet filtering policy did not violate Article I, Section 5 of the Washington State Constitution. The Court said: "It appears to us that NCRL's filtering policy is reasonable and accords with its mission and these policies and is viewpoint neutral. It appears that no article I, section 5 content-based violation exists in this case. NCRL's essential mission is to promote reading and lifelong learning. As NCRL maintains, it is reasonable to impose restrictions on Internet access in order to maintain an environment that is conducive to study and contemplative thought." The case returned to federal court.

In March 2007, Virginia passed a law similar to CIPA that requires public libraries receiving state funds to use content-control software. Like CIPA, the law requires libraries to disable filters for an adult library user when requested to do so by the user.[34]

Criticism


Filtering errors


Overblocking


A filter that is overly zealous, or that mislabels content not intended to be censored, can result in overblocking, or over-censoring. Overblocking can filter out material that should be acceptable under the filtering policy in effect; for example, health-related information may unintentionally be filtered along with pornography because of the Scunthorpe problem. Filter administrators may prefer to err on the side of caution by accepting overblocking to prevent any risk of access to sites that they determine to be undesirable. Content-control software was mentioned as blocking access to Beaver College before its name change to Arcadia University.[35] Another example was the filtering of the Horniman Museum.[36] Overblocking may also encourage users to bypass the filter entirely.
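
The Scunthorpe problem arises from exactly this kind of naive substring matching, as the toy example below shows (an illustrative two-word blocklist, not any product's actual rules): legitimate place names and health material trip the filter because a banned word happens to appear inside them.

```python
# An illustrative two-word blocklist, not any product's actual rules.
BLOCKED_WORDS = {"sex", "breast"}

def naive_filter(text: str) -> bool:
    """Block if any banned word occurs anywhere in the text (crude substring match)."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

print(naive_filter("Essex County Council"))            # True: "Essex" contains "sex"
print(naive_filter("breast cancer screening advice"))  # True: health material overblocked
print(naive_filter("family photo album"))              # False
```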

Underblocking


Whenever new information is uploaded to the Internet, filters can underblock, or under-censor, content if the parties responsible for maintaining the filters do not update them quickly and accurately, and a blacklisting rather than a whitelisting filtering policy is in place.[37]

Morality and opinion


Many[38] would not be satisfied with a government filtering viewpoints on moral or political issues, arguing that such filtering could become a vehicle for propaganda. Many[39] would also find it unacceptable that an ISP, whether by law or by the ISP's own choice, should deploy such software without allowing users to disable the filtering for their own connections. In the United States, the First Amendment to the United States Constitution has been cited in calls to criminalise forced internet censorship (see the section on legal actions below).

Religious, anti-religious, and political censorship


Many types of content-control software have been shown to block sites based on the religious and political leanings of the company owners. Examples include blocking several religious sites[40][41] (including the Web site of the Vatican), many political sites, and homosexuality-related sites.[42] X-Stop was shown to block sites such as the Quaker web site, the National Journal of Sexual Orientation Law, The Heritage Foundation, and parts of The Ethical Spectacle.[43] CYBERsitter blocks out sites like National Organization for Women.[44] Nancy Willard, an academic researcher and attorney, pointed out that many U.S. public schools and libraries use the same filtering software that many Christian organizations use.[45] Cyber Patrol, a product developed by The Anti-Defamation League and Mattel's The Learning Company,[46] has been found to block not only political sites it deems to be engaging in 'hate speech' but also human rights web sites, such as Amnesty International's web page about Israel and gay-rights web sites, such as glaad.org.[47]

Legal actions

In 1998, a United States federal district court in Virginia ruled (Loudoun v. Board of Trustees of the Loudoun County Library) that the imposition of mandatory filtering in a public library violates the First Amendment.[48]

In 1996 the US Congress passed the Communications Decency Act, banning indecency on the Internet. Civil liberties groups challenged the law under the First Amendment, and in 1997 the Supreme Court ruled in their favor.[49] Part of the civil liberties argument, especially from groups like the Electronic Frontier Foundation,[50] was that parents who wanted to block sites could use their own content-filtering software, making government involvement unnecessary.[51]

In the late 1990s, groups such as the Censorware Project began reverse-engineering the content-control software and decrypting the blacklists to determine what kind of sites the software blocked. This led to legal action alleging violation of the "Cyber Patrol" license agreement.[52] They discovered that such tools routinely blocked unobjectionable sites while also failing to block intended targets.

Some content-control software companies responded by claiming that their filtering criteria were backed by intensive manual checking. The companies' opponents argued, on the other hand, that performing the necessary checking would require resources greater than the companies possessed and that therefore their claims were not valid.[53]

The Motion Picture Association successfully obtained a UK ruling requiring ISPs to use content-control software to prevent copyright infringement by their subscribers.[54]

Bypassing filters


Content filtering in general can "be bypassed entirely by tech-savvy individuals." Blocking content on a device "[will not]…guarantee that users won't eventually be able to find a way around the filter."[55] Content providers may change URLs or IP addresses to circumvent filtering. Individuals with technical expertise may instead employ multiple domains or URLs that direct to a shared IP address where restricted content is present. This strategy does not circumvent IP packet filtering, but it can evade DNS poisoning and web proxies. Additionally, site operators may use mirrored websites that avoid filters.[56]

Some software may be bypassed by using alternative protocols such as FTP, telnet, or HTTPS, by conducting searches in a different language, or by using a proxy server or a circumventor such as Psiphon. Cached web pages returned by Google or other searches can also bypass some controls. Web syndication services may provide alternate paths for content. Some of the more poorly designed programs can be shut down by killing their processes: for example, in Microsoft Windows through the Windows Task Manager, or in Mac OS X using Force Quit or Activity Monitor. Numerous workarounds and counters to workarounds from content-control software creators exist. Google services are often blocked by filters, but these blocks may often be bypassed by using http:// in place of http://, since content filtering software is not able to interpret content under secure connections (in this case SSL).[needs update]

An encrypted VPN can be used as means of bypassing content control software, especially if the content control software is installed on an Internet gateway or firewall. Other ways to bypass a content control filter include translation sites and establishing a remote connection with an uncensored device.[57]


References

  1. ^ "Young, angry … and wired - May 3, 2005". Edition.CNN.com. 3 May 2005. Archived from the original on 8 December 2009. Retrieved 25 October 2009.
  2. ^ Umstead, R. Thomas (20 May 2006). "Playboy Preaches Control". Multichannel News. Archived from the original on 22 September 2013. Retrieved 25 June 2013.
  3. ^ Woolls, Daniel (October 25, 2002). "Web sites go blank to protest strict new Internet law". sfgate.com. Associated Press. Archived from the original on 8 July 2003.
  4. ^ Bickerton, Derek (30 November 1997). "Digital Dreams". The New York Times. Retrieved 25 October 2009.
  5. ^ "IT Glossary: Secure Web Gateway". gartner.com. Retrieved 27 March 2012.
  6. ^ "Censorware Project". censorware.net. Archived from the original on 20 June 2015.
  7. ^ "159.54.226.83/apps/pbcs.dll/article?AID=/20060319/COLUMN0203/603190309/1064". Archived from the original on 19 October 2007.
  8. ^ "DMCA 1201 Exemption Transcript, April 11 - Censorware". Sethf.com. 11 April 2003. Retrieved 25 October 2009.
  9. ^ "Windows Live to get censorware - ZDNet.co.uk". news.ZDNet.co.uk. 14 March 2006. Archived from the original on 5 December 2008. Retrieved 25 October 2009.
  10. ^ Client-side filters. NetSafeKids. National Academy of Sciences. 2003. ISBN 9780309082747. Retrieved 24 June 2013.
  11. ^ "Protecting Your Kids with Family Safety". microsoft.com. Retrieved 10 July 2012.
  12. ^ Xu, Xueyang; Mao, Z. Morley; Halderman, J. Alex (5 Jan 2011). "Internet Censorship in China: Where Does the Filtering Occur?" (PDF). Georgia Tech. University of Michigan. Archived from the original (PDF) on 24 March 2012. Retrieved 10 July 2012.
  13. ^ Christopher Williams (3 May 2012). "The Pirate Bay cut off from millions of Virgin Media customers". The Daily Telegraph. Retrieved 8 May 2012.
  14. ^ "Explicit and Transparent Proxy Deployments". websense.com. 2010. Archived from the original on 18 April 2012. Retrieved 30 March 2012.
  15. ^ Shaw, Keith; Fruhlinger, Josh (2025-08-06). "What is DNS and how does it work?". Network World. Retrieved 2025-08-06.
  16. ^ Filtering. NetSafeKids. National Academy of Sciences. 2003. ISBN 9780309082747. Retrieved 22 November 2010.
  17. ^ "Accountability Software: Accountability and Monitoring Software Reviews". UrbanMinistry.org. TechMission, Safe Families. Retrieved 25 October 2009.
  18. ^ Varadharajan, Vijay (2010). "Internet filtering - Issues and challenges". IEEE Security & Privacy. 8 (4): 62–65. doi:10.1109/MSP.2010.131. Retrieved 2025-08-06.
  19. ^ "ICRA: Technical standards used". Family Online Safety Institute. Retrieved 2025-08-06.
  20. ^ "Browse the Web with Internet Explorer 6 and Content Advisor". microsoft.com. March 26, 2003.
  21. ^ "ASACP Participates in Financial Coalition Against Child Pornography". November 20, 2007. Retrieved 2025-08-06.
  22. ^ "NetAlert: Parents Guide to Internet Safety" (PDF). Australian Communications and Media Authority. 2 August 2007. Archived from the original (PDF) on 19 April 2013. Retrieved 24 June 2013.
  23. ^ "Teenager cracks govt's $84m porn filter". the Sydney Morning Herald. Fairfax Digital. Australian Associated Press (AAP). 25 August 2007. Retrieved 24 June 2013.
  24. ^ "Restricted Access Systems Declaration 2007" (PDF). Australian Communications and Media Authority. 2007. Archived from the original (PDF) on 24 March 2012. Retrieved 24 June 2013.
  25. ^ a b "Learn - No Clean Feed - Stop Internet Censorship in Australia". Electronic Frontiers Australia. Archived from the original on 7 January 2010. Retrieved 25 October 2009.
  26. ^ "Danish Ministry of Culture Chooses SonicWALL CMS 2100 Content Filter to Keep Children's Libraries Free of Unacceptable Material". PR Newswire.com (Press release). Retrieved 2025-08-06.
  27. ^ "Danish Minister of Culture offers Internet filters to libraries". saferinternet.org. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  28. ^ "British Library's wi-fi service blocks 'violent' Hamlet". BBC News. 13 August 2013.
  29. ^ "Do we want a perfectly filtered world?", Louise Cooke, Lecturer, Department of Information Science, Loughborough University, November 2006. Archived 4 December 2013 at the Wayback Machine
  30. ^ "New research maps the extent of web filtering in public libraries". 11 April 2016. Retrieved 18 July 2016.
  31. ^ Short, Adrian (3 April 2014). "Should public libraries block payday loan websites?". Pirate Party UK. Archived from the original on 11 September 2016. Retrieved 16 April 2014.
  32. ^ Wallace, Jonathan D. (November 9, 1997). "Purchase of blocking software by public libraries is unconstitutional".
  33. ^ "ACLU Suit Seeks Access to Information on Internet for Library Patrons". ACLU of Washington. November 16, 2006. Archived from the original on December 5, 2006.
  34. ^ Sluss, Michael (March 23, 2007). "Kaine signs library bill: The legislation requires public libraries to block obscene material with Internet filters". The Roanoke Times. Archived from the original on February 29, 2012. Retrieved March 24, 2007.
  35. ^ "Web Censors Prompt College To Consider Name Change". slashdot.org. 2 March 2000. Retrieved 22 November 2010.
  36. ^ Lester Haines (8 October 2004). "Porn filters have a field day on Horniman Museum". The Register.
  37. ^ Stark, Philip B. (10 November 2007). "The Effectiveness of Internet Content Filters" (PDF). University of California, Berkeley. Archived (PDF) from the original on 2025-08-06. Retrieved 22 November 2010.
  38. ^ Lui, Spandas (23 March 2010). "Microsoft, Google and Yahoo! speak out in ISP filter consultation". AARNet.com. Retrieved 22 November 2010.
  39. ^ "Google and Yahoo raise doubts over planned net filters". BBC News. 16 February 2010. Retrieved 30 April 2010.
  40. ^ Kelly Wilson (2025-08-06). "Hometown Has Been Shutdown - People Connection Blog: AIM Community Network". AOL Hometown. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  41. ^ "Notice!!". Members.tripod.com. Retrieved 2025-08-06.
  42. ^ "www.glaad.org/media/archive_detail.php?id=103&". Archived from the original on June 7, 2008.
  43. ^ "The Mind of a Censor". Spectacle.org. Retrieved 2025-08-06.
  44. ^ "CYBERsitter: Where do we not want you to go today?". Spectacle.org. Retrieved 2025-08-06.
  45. ^ "See: Filtering Software: The Religious Connection". Csriu.org. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  46. ^ "See: ADL and The Learning Company Develop Educational Software". Anti-Defamation League. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  47. ^ "See: Cyber Patrol Examined". peacefire.org. Retrieved 2025-08-06.
  48. ^ "Mainstream Loudon v. Board of Trustees of the Loudon County Library, 24 F. Supp. 2d 552 (E.D. Va. 1998)". Tomwbell.com. Retrieved 25 October 2009.
  49. ^ "Reno v. American Civil Liberties Union - 521 U.S. 844 (1997)". Justia.com. 26 June 1997.
  50. ^ "Legal Victories". Electronic Frontier Foundation. Retrieved 2025-08-06.
  51. ^ "Children Internet Safety". www.justice.gov. 2025-08-06. Archived from the original on 2025-08-06. Retrieved 2025-08-06.
  52. ^ "Microsystems v Scandinavia Online, Verified Complaint". Electronic Frontier Foundation. United States District Court, District of Massachusetts. 15 March 2000. Archived from the original on 12 February 2009. Retrieved 25 October 2009.
  53. ^ Seth Finkelstein & Lee Tien. "Electronic Frontier Foundation White Paper 1 for NRC project on Tools and Strategies for Protecting Kids from Pornography and Their Applicability to Other Inappropriate Internet Content". National Academy of Sciences, Engineering, and Medicine. Archived from the original on 19 April 2006.
  54. ^ "Sky, Virgin Media Asked to Block Piracy Site Newzbin2". BBC News. 9 November 2011. Retrieved 26 March 2012.
  55. ^ Satterfield, Brian (4 June 2007). "Understanding Content Filtering: An FAQ for Nonprofits". TechSoup.org. Retrieved 24 June 2013.
  56. ^ Varadharajan, Vijay (July 2010). "Internet filtering - Issues and challenges". IEEE Security & Privacy. 8 (4): 62–65. doi:10.1109/msp.2010.131. ISSN 1540-7993.
  57. ^ "Is It Possible To Easily Avoid Internet Filters?". Comodo Cybersecurity. 4 June 2007. Retrieved 2 October 2018.