LIC AAO/SBI PO Pre Mini Mock-11

Question 1

Direction: Read the given passage carefully and answer the questions that follow.

A recent article in The Guardian estimated that the sex tech industry, which is less than a decade old, is already worth $30 billion. This estimate is expected to grow exponentially as the industry gears up to unveil hyper-realistic female sex robots customised for men. This has two main implications: first, considerable money, time and effort are dedicated to modelling machine behaviour to cater to male preferences by objectifying the female form. Second, the technology needed to drive these innovations is designed in most cases by male coders.
The gender equation is reinforced in another manner. While lines of code are written by men, artificial intelligence (AI) is often female. The fact that Siri, Alexa, Amelia, Amy and Cortana are all designed as hyper-intelligent yet servile female chatbots is not coincidental. On the other hand, women are highly under-represented in certain media fora. As coders and consumers of technology are largely male, they are crafting algorithms that absorb existing gender and racial prejudices. AI is replicating the same conceptions of gender roles that are being removed in the real world. For instance, Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa are essentially modelled after efficient and subservient secretaries. This seemingly innocuous assignment of female characteristics to AI personalities has dangerous implications. These chatbots reportedly receive sexually-charged messages on a regular basis. More damaging still is the fact that they are programmed to respond deferentially or even play along with such suggestions.
Voices of disembodied, supportive AI tend to be female as both men and women find them less threatening. This comfort in issuing orders to a female voice is inherently problematic, something tech companies have now acknowledged. Companies are investing in developing male bots and genderless bots, and reportedly, when someone asks Cortana, “Are you a girl?” she replies, “No. But I’m awesome like a girl.” Similarly, Alexa has been described as a ‘self-identified feminist’. While feminist female chatbots are encouraging, they can hardly solve the sexism built into AI by design. As AI grows in influence and gender biases continue seeping through algorithms, existing inequalities will be exacerbated. In India, for instance, the legal sector is gradually embracing AI, which is expected to improve speed and efficiency by automating tasks such as document drafting, legal research and due diligence. Similarly, news-writing bots are now functioning in the world of journalism. In both cases, AI will autonomously generate output by identifying story angles based on algorithms with ‘built-in’ criteria. When cases involving sexual violence and their portrayal in traditional news media are already under scrutiny, it’s important to question how male-hegemonic data sets will impact future news stories and court coverage of sexual assault and other topics requiring greater gender sensitivity. Since only 29% of internet users and 28% of mobile phone owners in India are women, improving access to basic information and communication technology services and infrastructure remains critical. There is nothing inherently empowering or sexist about technology. It just reflects the values of its creators.

Source: https://blogs.economictimes.indiatimes.com
Why is artificial intelligence (AI) often female?

I. To quickly adapt to the male-dominated ecosystem and cater to male preferences
II. Because these machines tend to perform jobs that have traditionally been associated with women
III. Because men find women attractive, and women are also OK dealing with women
IV. Because the technology needed to drive these innovations is designed in most cases by male coders.
V. Both men and women find it less threatening.

Question 2

Direction: Refer to the passage given in Question 1 and answer the question that follows.
What are the negative associations related to the assignment of a female persona to AI?

Question 3

Direction: Refer to the passage given in Question 1 and answer the question that follows.
Which of these sectors currently employ AI?
I. The media sector, which uses news-writing bots that autonomously generate output by identifying story angles based on algorithms with ‘built-in’ criteria.
II. The financial services sector, which uses it for mainstream business decision-making by gleaning insights from large amounts of unstructured data.
III. The legal sector, where it is expected to improve speed and efficiency by automating tasks such as document drafting, legal research and due diligence.

Question 4

Direction: Refer to the passage given in Question 1 and answer the question that follows.
Which of the statements is true with reference to the passage?

Question 5

Direction: Refer to the passage given in Question 1 and answer the question that follows.
According to the context of the passage, which of the following options is untrue?

Question 6

Direction: In the following number series only one number is wrong. Find out the wrong number.

64, 66, 76, 106, 180, 304

Question 7

Direction: In the following number series, only one number is wrong. Find out the incorrect number.
32, 45, 60, 81, 104, 133

Question 8

Direction: Find the wrong term in the given series:
7, 4, 5, 9, 20, 51, 160.5

Question 9

Direction: In the following number series, only one number is wrong. Find out the incorrect number.
0, 3, 10, 20, 36, 55

Question 10

Direction: In the following number series only one number is wrong. Find out the wrong number.

22, 30, 46, 70, 104, 142
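
Questions 6 to 10 all hinge on spotting the one term whose first differences break an otherwise regular pattern. The snippet below is a minimal illustrative sketch, not an official solution: the helper name and the candidate patterns are assumptions of this sketch, shown here for two of the series.

```python
# Illustrative sketch for Questions 6-10: list the first differences of a series
# and compare them with a candidate pattern to locate the odd term. The helper
# name and the candidate patterns below are assumptions of this sketch.

def first_differences(series):
    """Differences between consecutive terms."""
    return [b - a for a, b in zip(series, series[1:])]

# Question 6: one consistent reading is that the differences follow n^3 + n,
# i.e. 2, 10, 30, 68, 130, which would make 180 the term that breaks the chain
# (64 + 2 = 66, + 10 = 76, + 30 = 106, + 68 = 174, + 130 = 304).
print(first_differences([64, 66, 76, 106, 180, 304]))   # [2, 10, 30, 74, 124]
print([n ** 3 + n for n in range(1, 6)])                # [2, 10, 30, 68, 130]

# Question 9: one consistent reading is that the differences form the
# arithmetic progression 3, 7, 11, 15, 19, which would point to 20 as the
# odd term (0, 3, 10, 21, 36, 55 fits the pattern).
print(first_differences([0, 3, 10, 20, 36, 55]))        # [3, 7, 10, 16, 19]
```

The same table-of-differences idea, or a table of term-to-term operations for multiplicative series, narrows each of the other series down to a single out-of-pattern term.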

Question 11

Direction: In each question below, some statements are given followed by some conclusions. You have to take the given statements to be true even if they seem to be at variance with commonly known facts. Read all the conclusions and then decide which of the given conclusions logically follows/follow from the given statements, disregarding commonly known facts.

Statements:
No part is link.
Only a few straw are links.
All pots are links.
Conclusions:
I. Some straw are definitely not links.
II. All links being straw is not a possibility.

Question 12

Direction: In each question below, some statements are given followed by some conclusions. You have to take the given statements to be true even if they seem to be at variance with commonly known facts. Read all the conclusions and then decide which of the given conclusions logically follows/follow from the given statements, disregarding commonly known facts.
Statements:
No part is link.
Only a few straw are links.
All pots are links.
Conclusions:
I. All links being pots is not a possibility.
II. All parts are pots.
III. All pots can never be straw.

Question 13

Direction: In each question below, some statements are given followed by some conclusions. You have to take the given statements to be true even if they seem to be at variance with commonly known facts. Read all the conclusions and then decide which of the given conclusions logically follows/follow from the given statements, disregarding commonly known facts.
Statements:
All brown are breads.
A few breads are nice.
Only a few jam are nice.
Conclusions:
I. All nice being jam is not a possibility.
II. All jam are brown.

Question 14

Direction: In each question below, some statements are given followed by some conclusions. You have to take the given statements to be true even if they seem to be at variance with commonly known facts. Read all the conclusions and then decide which of the given conclusions logically follows/follow from the given statements, disregarding commonly known facts.
Statements:
All brown are breads.
A few breads are nice.
Only a few jam are nice.
Conclusions:
I. All jam can never be nice.
II. Some jam are breads.

Question 15

Direction: In the question below are given statements followed by some conclusions. You have to take the given statements to be true even if they seem to be at variance with commonly known facts. Read all the conclusions and then decide which of the given conclusions logically follows from the given statements disregarding commonly known facts.
Statement:
No physics is maths.
Some chemistry is maths.
All sciences are chemistry.
Conclusion:
I. No science is physics.
II. Some physics are science.
III. Some physics are chemistry.
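
Syllogism sets like Questions 11 to 15 can also be sanity-checked mechanically. The sketch below is my own illustrative code, not part of the question set: the function names and the three-element universe are assumptions, and a small universe is only a counterexample-hunting heuristic rather than a proof. It brute-forces Question 15 by enumerating set assignments for the four terms and testing whether each conclusion holds in every model of the statements.

```python
# Illustrative sketch (not from the source): brute-force Question 15 by trying
# every assignment of the four terms to subsets of a small universe. A
# conclusion "logically follows" only if it holds in every model in which all
# three statements hold; one counterexample is enough to reject it. A
# three-element universe is a small-model heuristic, adequate for hunting
# counterexamples in simple syllogisms.
from itertools import product

UNIVERSE = range(3)
SUBSETS = [frozenset(x for x in UNIVERSE if mask & (1 << x))
           for mask in range(2 ** len(UNIVERSE))]

def follows(premises, conclusion):
    """True if the conclusion holds in every model that satisfies the premises."""
    for physics, maths, chemistry, science in product(SUBSETS, repeat=4):
        model = {"physics": physics, "maths": maths,
                 "chemistry": chemistry, "science": science}
        if all(p(model) for p in premises) and not conclusion(model):
            return False  # counterexample: premises hold, conclusion fails
    return True

premises = [
    lambda m: not (m["physics"] & m["maths"]),     # No physics is maths.
    lambda m: bool(m["chemistry"] & m["maths"]),   # Some chemistry is maths.
    lambda m: m["science"] <= m["chemistry"],      # All sciences are chemistry.
]
conclusions = {
    "I. No science is physics":        lambda m: not (m["science"] & m["physics"]),
    "II. Some physics are science":    lambda m: bool(m["physics"] & m["science"]),
    "III. Some physics are chemistry": lambda m: bool(m["physics"] & m["chemistry"]),
}
for label, check in conclusions.items():
    print(label, "->", follows(premises, check))   # each prints False: none follows on its own
```

Note that this check treats each conclusion in isolation; in the usual exam convention, conclusions I and II here form a complementary (either/or) pair, which a model-by-model test like this does not capture on its own.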