The New York City Council yesterday passed legislation seeking to address problems with algorithms, which can determine which school a child attends, whether a person is offered credit from a bank, which products are advertised to consumers, and whether someone receives an interview for a job. Government officials also use them to predict where crimes will take place, who is likely to commit a crime, and whether someone should be allowed out of jail on bail. The algorithms used in facial recognition technology, for example, have been shown to be less accurate on Black people, women, and juveniles.
The new bill, which awaits the signature of Mayor Bill de Blasio, states:
This bill would require the creation of a task force that provides recommendations on how information on agency automated decision systems may be shared with the public and how agencies may address instances where people are harmed by agency automated decision systems.
The task force would need to be formed within three months of the bill’s signing, and importantly it must include “persons with expertise in the areas of fairness, accountability and transparency relating to automated decision systems and persons affiliated with charitable corporations that represent persons in the city affected by agency automated decision systems.”
The New York division of the ACLU has argued in favor of the bill.
It appears that it is very easy to get a fake academic certificate in any major urban center. The buyer of the fake certificate doesn't have to worry about the authenticity of the signatures or the paper quality – all of that is sorted out by the fraudsters.
The discussion is often about whether you want a PhD, a master's or a bachelor's degree. Additionally, the more prestigious the university you want to claim to have graduated from, the more money you will be required to pay for the fake certificate.
Once you have the fake paper in your hands, you can apply for prominent jobs, particularly in the public sector, where job security is so high that getting fired at a later stage is more complicated and costly.
Employers find it time-consuming to authenticate or verify that glimmering certificate with the purported university, for various reasons.
In developed economies, data protection laws do not allow universities to disclose students' private credentials to third parties – unless the students expressly ask them to make the disclosure.
MIT and the University of Melbourne are pioneering a solution to this problem. Blockchain technology provides a decentralised ledger that is globally accessible, immutable, secure, and supports anonymity. Universities can record student academic certificates on the global blockchain, allowing graduates to access their credentials from anywhere in the world and share them with potential employers.
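The record-and-verify flow described above can be sketched roughly as follows. This is an illustrative toy, not MIT's or Melbourne's actual system: the `Ledger` class stands in for a real blockchain, and the certificate fields are invented. The key idea is that only a hash of the certificate is published, so the credential stays private while any tampering is detectable.

```python
import hashlib
import json
import time

def certificate_hash(cert: dict) -> str:
    """Deterministic SHA-256 digest of a certificate's canonical JSON form."""
    canonical = json.dumps(cert, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

class Ledger:
    """Stand-in for a blockchain: an append-only list of hash records."""
    def __init__(self):
        self.records = []

    def anchor(self, digest: str) -> None:
        # A real chain would also make this record immutable and public.
        self.records.append({"digest": digest, "timestamp": time.time()})

    def contains(self, digest: str) -> bool:
        return any(r["digest"] == digest for r in self.records)

# The university issues a certificate and anchors only its hash.
ledger = Ledger()
cert = {"student": "Jane Doe", "degree": "BSc Computer Science",
        "university": "Example University", "year": 2017}
ledger.anchor(certificate_hash(cert))

# An employer verifies the certificate a candidate presents.
presented = dict(cert)
assert ledger.contains(certificate_hash(presented))    # genuine copy matches

# A forged certificate hashes to a different digest, so verification fails.
forged = dict(cert, degree="PhD Computer Science")
assert not ledger.contains(certificate_hash(forged))
```

Because the hash is one-way, an employer can confirm a credential without the university disclosing anything beyond what the graduate chooses to share.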
The list of occupations that will be decimated by artificial intelligence and automation keeps growing, with drivers, translators and shop assistants under threat from the rise of the robots. Now you can add lawyers to the list.
Both the humans and the AI were given the basic facts of hundreds of PPI (payment protection insurance) claims and asked to predict whether the Financial Ombudsman would allow each one.
In all, they submitted 775 predictions and the computer won hands down, with Case Cruncher getting an accuracy rate of 86.6%, compared with 66.3% for the lawyers.
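As a back-of-envelope check, those percentages are consistent with roughly 671 and 514 correct calls out of 775. The per-side counts below are inferred for illustration; only the 775 total and the two percentages come from the article.

```python
def accuracy(correct: int, total: int) -> float:
    """Percentage of correct predictions, rounded to one decimal place."""
    return round(100 * correct / total, 1)

# Case Cruncher: about 671 correct predictions out of 775 (assumed split).
assert accuracy(671, 775) == 86.6

# Lawyers: about 514 correct out of 775 (assumed split).
assert accuracy(514, 775) == 66.3
```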
Case Cruncher is not the product of a tech giant but the brainchild of four Cambridge law students. They started out with a simple chatbot that answered legal questions – a bit of a gimmick but it caught on.
Two judges oversaw the competition: Cambridge law lecturer Felix Steffek and Ian Dodd of Prediction, a company that runs one of the world's biggest databases of legal cases. Dodd says the youthful Case Cruncher team chose the subject for the contest well.
Ian Dodd thinks AI may replace some of the grunt work done by junior lawyers and paralegals but no machine can talk to a client or argue in front of a High Court judge. He puts it simply: “The knowledge jobs will go, the wisdom jobs will stay.”
Sir Venki Ramakrishnan says the risks and benefits of germline therapy, which is banned in Britain, should be debated.
An international team of scientists, led by researchers at the Oregon Health and Science University, has used genetic engineering on human sperm and a pre-embryo. The group says it is doing basic research to figure out whether new forms of genetic engineering might be able to prevent or repair terrible hereditary diseases.

Congress has banned federal funding for genetic engineering of sperm, eggs, pre-embryos or embryos. That means everything goes on in the private or philanthropic world, here or overseas, without much guidance.

It should be determined who should own the techniques for genetic engineering. Important patent fights are underway among the technology's inventors, which means lots of money is at stake. And that means it is time to talk about who gets to own what and charge what.

Finally, human genetic engineering needs to be monitored closely: all experiments registered, all data reported to a public database and all outcomes – good and bad – made available to all scientists and anyone else tracking this area of research. Secrecy is the worst enemy that human genetic engineering could possibly have.

Today we need to focus on who will own genetic engineering technology, how we can oversee what is being done with it and how safe it needs to be before it is used to try to prevent or fix a disease. Plenty to worry about.
Google and Viacom both faced a class-action lawsuit claiming that Nickelodeon's Nick.com placed cookies on the computers of children under 13, and that Google used those cookies to work out which videos kids had watched and which games they played, in order to serve targeted ads.
It's against US law to gather personal information from children under 13 without warning parents and getting their permission; the suit claimed that permission was never sought.
However, the appeals court said Google was not liable as even though it served up ads to kids, it did not collect their info directly: it was given the data by Viacom’s Nick.com servers.
Viacom was also largely let off, as the court said that the data gathered was deemed not specific enough to be personally identifying.
In absolving Google, the appeals court drew a parallel to the case that inspired the Video Privacy Protection Act: the leaking of Supreme Court nominee Robert Bork's video rental history. Just as the court ruled that the Washington Post was not liable for receiving and publishing Bork's rental history from the video store, Google is not liable for receiving the IP address and browsing history that Viacom's cookies collected.
There is uncertainty as to what the government is doing with the images. Facial-recognition systems may indeed speed up the boarding process, but the real reason they are cropping up in U.S. airports is that the government wants to keep better track of who is leaving the country, by scanning travelers' faces and verifying those scans against photos it already has on file. The idea is that this will catch fake passports and make sure people aren't overstaying their visas. The U.S. Department of Homeland Security has partnered with airlines including JetBlue and Delta to introduce such recognition systems at New York's JFK International Airport, Washington's Dulles International, and airports in Atlanta, Boston, and Houston, among others. It plans to add more this summer.
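The verification step described above is typically a 1:1 comparison: a numeric "embedding" of the gate camera's face scan is compared against the embedding of the photo on file, and the traveler is accepted if the two are similar enough. The sketch below illustrates that idea with random vectors and an invented threshold; it does not reflect the actual DHS system or any real face-recognition model.

```python
import math
import random

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(live_embedding, stored_embedding, threshold=0.6):
    """Accept the traveler if the live scan is close enough to the photo on file.

    The 0.6 threshold is an illustrative assumption; real systems tune this
    to trade off false accepts against false rejects.
    """
    return cosine_similarity(live_embedding, stored_embedding) >= threshold

random.seed(0)  # deterministic demo
on_file = [random.gauss(0, 1) for _ in range(128)]          # passport-photo embedding
same_person = [x + random.gauss(0, 0.1) for x in on_file]   # live scan, small variation
stranger = [random.gauss(0, 1) for _ in range(128)]         # unrelated face

assert verify(same_person, on_file)       # small variation still matches
assert not verify(stranger, on_file)      # unrelated embedding is rejected
```

The accuracy disparities mentioned below arise because real embedding models produce noisier, less separable vectors for groups underrepresented in their training data, pushing genuine matches below the threshold.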
Privacy advocates have raised concerns about the technology (see "As It Searches for Suspects, the FBI May Be Looking at You"). They also point out that research has shown it to be less accurate with older photos and with images of women, African-Americans, and children (see "Is Facial Recognition Accurate? Depends on Your Race").
A Western New York police department has purchased a drone for $9,994.99.
It will fly the skies of West Seneca to help officers solve crimes and keep the community safe. The grant that funded the purchase was secured by State Senator Patrick Gallivan.
West Seneca Police have been training for eight months on how to use the new technology, which officers say will assist in many different police missions, including search and rescue, monitoring creek levels during flooding, and crime scene analysis.
The drone is equipped to drop items to those in need. During a hostage situation, for example, officers can load a cell phone into it for delivery, which will help hostage negotiators maintain communication. The drone can travel up to 400 feet high at speeds of up to 50 miles per hour, and carries a rotating camera that captures video from all angles.
"The drone also can give investigators an indication of where a fire started," according to Lt. McNamara. "In accident investigation, it can be used to show the weather conditions at the time of an accident."
The department is ready to start flying, but is waiting for final approval from the Federal Aviation Administration to use the drone at night.