Feminist autonomous infrastructures

Organization: Media@McGill and Tactical Tech Collective

Introduction

Women, feminists, and gay, lesbian, bisexual, trans*, queer and intersex (GLBTQI) individuals share common experiences online: they can easily become targets of online harassment, discrimination or censorship, be it by governments, private actors or corporations. When trying to understand the relationship between gender, violence and technology, one should keep in mind that online violence is intrinsically linked to real-life situations. Where bigotry, sexism and homophobic attitudes exist in societies, they will almost inevitably be amplified in the online world.

“Real” name policies, data mining, tracking and surveillance technologies have become so intertwined that the days when no one knew if you were a dog or a cyborg on the internet are largely over. In fact, the creation of an industry around the profiling of users, coupled with the centralisation and concentration of the internet, has led to a situation where it is not a safe space (if it ever was). In 1996 the Declaration of the Independence of Cyberspace announced the creation of “a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity.”1 Nowadays, however, it is all too common to see the work and voices of women, feminists and GLBTQI people deleted, censored and/or prevented from being seen, heard or read. Much of this gender-based online violence happens on corporate social media platforms such as Facebook, Twitter and Reddit and in the blogosphere, as well as in non-profit online spaces such as Wikipedia. All of these involve large communities governed by sets of practices and policies. Yet despite the rules that govern these spaces, and because of certain practices, silencing, intimidation and discrimination continue. So far, responses from GLBTQI people to violence have included organised public shaming, doxxing of harassers,2 feminist counter-speech, active research and documentation, awareness raising around privacy and security, advocacy for amendments to corporate terms of service, and lobbying of institutions contributing to the governance of the internet, among others. While these tactics are paramount to the embodiment of everyday forms of online resistance,3 there is also a need to think about strategies that are not only reactive, but that also project us into the future we want. In other words, it is about dreaming and actively pre-figuring our technologies.
Proactive practices involve understanding what it means to take back the command and control of technologies by using, creating and maintaining our own, and by shaping our communication and technological infrastructures. Using corporate services such as Facebook or Twitter may be very convenient, and at times strategic, because they are generally provided for free and because this is where the so-called critical masses are. But using them also means accepting their terms of service, which are primarily shaped by profit, and in which human rights and gender social justice remain of negligible importance. When using these online services, we and our networks are at their mercy, which means we cannot fully control our data, social networks and historical memories (or traces) on the internet. While the future of the internet often looks bleak, it is paramount not only to continue to investigate the processes and governance structures of the internet, but also to continue to build a communication and technological ecology that puts human well-being, rather than profit, front and centre. What will happen when big data has its own algorithms? What will be the combined relationships between these algorithms and Facebook's internet.org project or the “Internet of Things”, to name only two of the upcoming situations that will again redefine people's rights to privacy and free expression? As long as our data remains under corporate control, it can be sold or given to third parties to exploit, or it can be deleted or shut down. Ultimately, it becomes our digital shadow,4 enabling others to track, profile and control our voices, opinions and expressions. Part of the answer lies in developing, supporting and using not-for-profit, independent, privacy-aware and secure alternatives to corporate online services.
Collectives such as Riseup, Nadir and Autistici/Inventati have been powered by hacktivist collectives for almost two decades now.5 They have provided – through volunteer work and community contributions – valuable basic online services such as email, mailing list servers, wikis, pads,6 blogs and virtual private networks (VPNs)7 to activists all around the world. But where are the feminist tech collectives that design and maintain feminist autonomous infrastructures for feminists, queer and trans* people, and activists at large? We also need to ask ourselves why these feminist tech collectives are still so embryonic, and what this tells us about the discrimination and violence that happen when women and feminists do not control, own and manage the technological infrastructure they need to express themselves and act online.

Shaping autonomy within our technologies

One of the main constitutive elements of feminist autonomous infrastructures lies in the concept of self-organisation, already practised by many social movements that understand autonomy as a desire for freedom, self-valorisation and mutual aid. In addition, we understand the term technological infrastructure in an expansive way, encompassing hardware, software and applications, but also participatory design, safe spaces and social solidarities. Concrete examples of feminist autonomous infrastructures include the Geek Feminism Wiki,8 specific technologies that tackle gender-based online violence (such as bots against trolls), feminist online libraries and feminist servers, but also offline safe spaces such as feminist hackerspaces, which allow feminist, queer and trans* hackers, makers and geeks to gather and learn with others.

When talking about these examples of feminist autonomous infrastructures, we recognise that none of them can be fully autonomous; their autonomy is relative, as they still depend, for instance, on existing communication networks and on technologies designed by mainstream companies (such as computers, servers and access devices). Having said that, their autonomy rests on their governance models, the values they embrace and the principles they promote. While feminist autonomous infrastructures are diverse in scope and in shape, they share a desire to proactively create the conditions for their autonomy while following an ethic of care9 embedded in the active practice of social solidarities. Caring for such infrastructures, and recognising their importance, are central to addressing a technology cycle that is rife with inequality, from the production of technology through its access, uptake, development and governance to the end of its life cycle. This intersectional and integrated approach to technology goes hand in hand with a feminist posture that does not shy away from addressing all forms of violence, whether online violence or the violence intrinsic to resource extraction and to factory and assembly-line work that is gendered and raced.10

Recently, momentum has gathered around the building of feminist autonomous infrastructures. These initiatives are still in their embryonic stage, mainly representing a set of scattered and fragmented initiatives. Below we highlight two different examples – one addressing the need for physical safe spaces enabling women and feminists to gather and uplift their skills, and another addressing the slow-politics around the creation of feminist servers.

Breaking the circle of isolation by learning together

The Gender and Technology Institute11 was organised by the Tactical Technology Collective and the Association for Progressive Communications (APC) at the end of 2014. The event brought together almost 80 participants and facilitators, mostly from the global South, to focus on some of the issues faced daily by women and trans* persons on the internet, to share strategies and tools for better protecting our privacy and security online, and to discuss how to spread knowledge and skills in our communities and organisations. Since then, the network has expanded, with different outcomes ranging from the creation of a collaborative online space enabling the documentation of the activities around privacy and digital security delivered by its members on the ground, to the production of a manual specifically addressing gender-related issues which also offers various strategies and tools for taking control of our online identities and learning how to shape safe spaces.

All these outcomes are informed by the stories and creative practices of women and feminist grassroots activists, located in 22 different countries, who are actively and creatively using and making technology to tackle gender-based online violence. Along the way, they are becoming digital security trainers and privacy advocates, and they are helping others to adopt safer and more joyful practices when engaging online and offline.

Eight months after the event, the Gender and Technology Institute has become an informal international network of support, a friendly resource space based on social solidarities that helps to break the circle of isolation.12 This contributes to strengthening the technological autonomy of its participants and, by extension, of women, feminists and GLBTQI individuals and organisations, helping them face the challenges and threats arising from their use of the internet.

Feminist servers

A server can be defined as a computer connected to a network that provides services such as hosting files, websites and online applications. Because all online resources are hosted on servers, they constitute the base of the internet as we know it. Every server is governed by particular terms of service, a governance model and national legislation concerning privacy and access to data by third parties (or “trackers”), and depends on some kind of business model. This somewhat technical definition can obscure the political dimension of setting up and managing a server. In that sense, what would be the purposes13 and principles14 of a feminist server? Can feminist servers support women, feminists and GLBTQI people in their fight to have rights such as freedom of expression and opinion respected? Can we create enough trust among ourselves to develop cooperative approaches to managing these spaces of resistance and transformation? These were more or less the questions that a group of people interested in gender asked themselves during the first Feminist Server Summit15 in December 2013 and at the first TransHackFeminist (THF!) Convergence16 held in August 2014. The discussions that emerged out of those meetings recognised that we do not yet have feminist tech collectives that design feminist autonomous infrastructures for the feminist, queer and trans* movement(s), and that this should become a priority.17
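To make the definition above concrete: a server is simply a program bound to a network port that answers requests from clients, and whoever runs it decides what is served, stored and logged. The following is a minimal sketch in Python using only the standard library; the handler and function names (`HelloHandler`, `start_server`) are hypothetical and for illustration only, and a real autonomous server would of course run hardened, community-maintained software rather than this toy.

```python
# Minimal sketch: a "server" is just a program listening on a port.
# Hypothetical demo code, not production software.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The operator's code decides what every visitor receives --
        # which is exactly why control of servers is a political question.
        body = b"Hosted on an autonomous server.\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Silence per-request logging for this demo; what (and whether)
        # a server logs is itself a privacy decision made by its operator.
        pass


def start_server(port=0):
    """Serve on localhost in a background thread; returns (server, port).

    Port 0 asks the operating system for any free port.
    """
    srv = HTTPServer(("127.0.0.1", port), HelloHandler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv, srv.server_address[1]
```

Pointing a browser at `http://127.0.0.1:<port>/` then returns the page. The same mechanics, scaled up and placed under a community's own governance and terms of service, are what hosting collectives operate.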

For example, two feminist servers that were dormant re-emerged during the THF! Convergence:

  • The Systerserver project, originally launched in the early 2000s by the Genderchangers18 and the Eclectic Tech Carnival (/etc), which focuses on hosting online services such as etherpads and a voice over internet protocol (VoIP) application.

  • The Anarcha server,19 started by the TransHackFeminists from Calafou, an eco-industrial post-capitalist colony located in Catalonia. It hosts a MediaWiki, a WordPress farm and a media publishing platform.

These feminist servers are run by a loose coalition of women, queer and trans* people from around the world, some explicitly interested in hacking heteronormativity and patriarchy. They are also about demonstrating that it is possible to create safe spaces where the harassment of women, feminists and GLBTQI people is not allowed and where all can learn about technology in a non-hierarchical and non-meritocratic way. However, even if these server initiatives inspire many, they remain at an embryonic stage. Moreover, they do not consider themselves service providers; nor have they clearly decided to become stable and sustainable tech collectives providing hosting and online services to women's, feminist and GLBTQI groups. In any case, they show that feminist servers are possible and that they should become a political aim for any organisation working in the field of gender social justice and GLBTQI rights – which should be concerned about achieving autonomy in communication and technological infrastructures, in addition to securing their data, social networks and historical memories on the web.

Conclusion

The targeting, silencing and censorship of women, feminists and GLBTQI people online has been and is being challenged in multiple ways. Women, feminists and GLBTQI people have been particularly creative in their everyday forms of resistance and their solidarities and care towards one another. While the initiatives outlined above are exciting, they do remain at an embryonic stage where only a few are able to participate. The reasons why so few initiatives exist ought to be at the core of a feminist analysis to understand how gendered technology actually is. Who is encouraged at a young age to tinker with technology? What kind of division of labour exists when it comes to technology? Why is the level of attrition so high for women in the tech industry?

While seriously considering the above, it remains that if we want to see the Feminist Principles of the Internet as formulated by APC become a reality, we need our own feminist autonomous infrastructures. To do so, we need feminist tech collectives that focus on providing these services. We need to be active in developing our expertise and that of the younger generation. But for that to happen, the feminist and GLBTQI movement(s) need to pay more attention to these issues, create more safe spaces to learn collectively, stop fearing technologies, and decide collectively that we need to change gears to reshape our own communication and technological infrastructure. After all, freedom of expression is part of the feminist struggle, and women, feminists and GLBTQI people can contribute by collectively providing the knowledge and means to ensure that their right to speak up remains accessible online, offline, and in whatever format expression emerges.

References:

1 Barlow, J. P. (1996). A Declaration of the Independence of Cyberspace. www.eff.org/cyberspace-independence

2 Doxxing of harassers means searching for and publishing private information about a harasser on the internet with the aim of shaming the individual.

3 An example of the embodiment of everyday forms of resistance is that of feminist social media practices that resist rape culture by hijacking Twitter feeds and hashtags that blame victims and perpetuate myths and stereotypes.

4 See “My shadow” by the Tactical Technology Collective: myshadow.org

5 For a more extensive list of autonomous servers, see, for example, Riseup's list of radical servers: riseup.net/en/security/resources/radical-servers

6 Riseup, for example, offers an etherpad service that activists can use: pad.riseup.net

7 Riseup.net offers a VPN service; to know more visit: riseup.net/en/vpn

8 The Geek Feminism Wiki is available at: geekfeminism.wikia.com

9 Adam, A. (2003). Hacking into Hacking: Gender and the Hacker Phenomenon. ACM SIGCAS Computers and Society, 33(4).

10 Nakamura, L. (2014). Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture. American Quarterly, 66(4), 919-941.

11 To know more, see the Gender and Technology Institute pages at: tacticaltech.org

12 One example is the International Feminist Hackathon Day (a.k.a. FemHack) held on 23 May 2015. To know more about this initiative see: www.f3mhack.org

13 For a history of where the desire for feminist servers arose read: Alarcon, S. et al. (2015, 30 April). Exquisite Corpse. New Criticals. www.newcriticals.com/exquisite-corpse/page-8

14 Following discussions at the Feminist Server Summit, Femke Snelting came up with a list that defines what a feminist server is.

15 vj14.constantvzw.org

16 transhackfeminist.noblogs.org/post/2015/01/25/a-transhackfeminist-thf-convergence-report and anarchaserver.org/mediawiki/index.php/Main_Page

17 The theme of the second edition of the TransHackFeminist (THF!) Convergence is aptly titled “Error 404. Dissent Technologies Not Found”: transhackfeminist.noblogs.org

18 A video about the GenderChangers is available online.

19 anarchaserver.org

Themes: 


          

La DGOJ de España participó de la reunión de juego responsable del Gambling Regulators European Forum    

Cache   

El último miércoles se llevó a cabo la reunión del grupo de trabajo de juego responsable del GREF en la sede de la Autoridad Holandesa del Juego, en La Haya; y los asistentes pudieron conocer más acerca de los avances en materia de juego responsable por parte del regulador holandés, la utilización del Big Data y la inteligencia artificial aplicada al juego, y los mecanismos de recompensa a jugadores en términos de juego responsable, entro otros temas destacados.

          

MDH bjuder på AI-kurs för processindustrin   

Cache   

Big Data, AI och maskininlärning (ML) borde kunna trimma svensk processindustri, tänker Mälardalens högskola (MDH), och lägger upp en gratiskurs på webben.

Läs mer...


          

Data Analyst, Advanced Analytics   

Cache   

LOCATION Calgary Alberta (CA-AB) JOB NUMBER 32430 Why you should join us We’re experiencing an exciting time at Suncor we are working to apply digital technologies to accelerate operational excellence to help us achieve world-class performance generate value drive and enhance our competitive advantage and create the workplace of the future As part of this evolution what we call Suncor 4 0 we’ve started using machine learning robotics AI and remote sensing technology and we know that our journey into the digital world will only accelerate and we need your help The next phase in our company’s evolution is about unleashing the full potential of our people and our company to work differently harnessing emerging technology and new digital capabilities and developments that are transforming our world We have a fantastic role for someone early on in their data analytics career Join our team during this transformative time and build your career with us You will be mining data creating visuals and graphic representations to convey a powerful insightful message while supporting our organization in making meaningful decisions based on advanced data You will use your expertise to - Participate in product development from ideation to full deployment - Perform exploratory data analysis - Work closely with user experience and interface designer to assess implementation efforts and influence solution design - Translate mockups and wireframes into informative visuals in multiple formats (e g pbix D3 js avi wmv psd interactive notebook and others) - Assist in the creation of advanced analytics models and product development - Design and execute performance tests of front end applications in a big data context - Work with advanced analytics teams to make smarter products - Maintain quality and ensure responsiveness of front end applications - Create and maintain documentation for support and future enhancements - Interact with Center of Excellence (COE) teams Analytics Specialists 
and Engineers in the creation of models - Apply numerical analysis exploratory data analytics engineering principles mathematical and other data techniques to business problems solvable through data-driven decisions We’d like to review your application if you have… Must-haves (minimum requirements) - One to three years of experience in a quantitative field with strong knowledge of data and analytics - Proven experience with relational database and writing SQL code - Experience with analytics scripting languages such as Python R - A Bachelor’s degree with a focus in Computer Science Analytics or Computer Information Systems - Knowledge and experience with Microsoft Power BI and other visualization tools - Alignment with our values of safety above all else respect raise the bar commitments matter and do the right thing Preference for - Experience working in an AI startup environment or organizations with an agile culture - Experience and interest in visual and graphic design - An open mind to new approaches and learning - A professional attitude and service orientation superb team player Where you’ll be working your work schedule and other important information - You will work out of our Calgary head office located in the Suncor Energy Centre at 150 – 6th Ave S W - Hours of work are a regular 40-hour work week Monday to Friday with the potential for extended work hours based on business needs Why Suncor We are Canada s leading integrated energy company with a business portfolio that includes oil sands development and upgrading offshore oil and gas production petroleum refining and product marketing under the Petro-Canada brand Our global presence offers rewarding opportunities for you to learn contribute and grow in a variety of career-building positions We live by the value of safety above all else – do it safely or don’t do it Our st

          

Computer Science Student   

Cache   

LOCATION Various Alberta (CA-AB) JOB NUMBER 29735 Why you should join us Student positions at Suncor (including co-op and intern) are more than just a work term They provide you with significant and meaningful work experiences to help enable you to figure out where you want your career to go – better yet many of our co-op students go on to become permanent Suncor employees after they graduate You will work side-by-side with some of the most talented people in the energy industry and your work assignments and mentoring will offer you outstanding academic and career growth You will have the opportunity to put your academic knowledge into practice and leave your mark within a large integrated organization   You will be given the opportunity to be involved with new technology and to work with large data sets to help streamline work   You will use your expertise to - Participate in the development of solutions of technical issues and in the verification of implementation of these solutions - Understand and optimize current operations and develop tools for tracking process data for analysis and metrics reporting - Craft and present summary reports performance metrics reports graphical representation and presentations as required - Sort file and organize various databases We’d like to review your application if you have… Must-haves (minimum requirements) - Current enrollment in a formal co-op or internship program in 3rd or 4th year Computer Science Data Science or related program at an accredited post-secondary institution and returning to full-time studies after the work term - (Attach your transcripts with your application and clearly indicate the length of work term for which you are available) - Alignment with our values of safety above all else respect raise the bar commitments matter and do the right thing Preference for - Proven computer application knowledge with advanced understanding of Excel (pivot tables v-lookups advanced filters) and PowerPoint - A deep 
understanding of analytical techniques quantitative problem solving and data manipulation - An understanding of Power BI and other technologies used to handle big data and SAP - Previous experience in a data science or an analytics environment - A continuous improvement mindset and are able to seek greater knowledge and understanding of the systems process and hazards in the workplace - A zero-tolerance for shortcuts and if the procedure is incorrect you fix it Where you’ll be working your work schedule and other meaningful information - You will be based in Calgary Edmonton Fort McMurray or Fort Hills - This is an 8 – 12 month work term from January 2020 – December 2020 Why Suncor Start your career at Canada s leading integrated energy company with a business portfolio that includes oil sands development and upgrading offshore oil and gas production petroleum refining and product marketing under the Petro-Canada brand Our global presence offers rewarding opportunities for you to learn and grow in a variety of career-building positions We live by the value of safety above all else – do it safely or don’t do it Our strong track record of growth and a focus on sustainability mean tremendous potential for the future  Learn about our mission vision and values Stay connected to us - Follow us on LinkedIn  Facebook and Twitter for the latest job postings and news - Join our Talent Community and sign up to receive customized job alerts - Read our Suncor Connections newsletter to see what we’re doing in the communities we live and work in We are an equal opportunity employer and encourage applications from all qualified individuals We are committed to providing a diverse and inclusive work environment where every employee feels valued and respected We will consider accessibility accommodations to applicants upon request Check out our social goal to learn how we are wo

          

InterSystems and Inspur Enter Agreement to Innovate Healthcare Big Data Platforms in China   

Cache   

InterSystems, a global leader in information technology platforms for health, business, and government applications, announced an agreement for strategic cooperation, signed at its recent Global Summit, with leading Chinese cloud computing and big data service provider, Inspur. Inspur will implement its healthcare big data platform using the InterSystems IRIS for Health™ data platform, and the […]

          

Project Manager (f/m) SAP Innovative Business Solutions - SAP - Walldorf   

Cache   

We make innovation real by using the latest technologies around the Internet of Things, blockchain, artificial intelligence / machine learning, and big data and…
Gefunden bei SAP - Tue, 29 Oct 2019 18:37:42 GMT - Zeige alle Walldorf Jobs

          

Global Program Manager (f/m/d) SAP Innovative Business Solutions - SAP - Walldorf   

Cache   

We make innovation real by using the latest technologies around the Internet of Things, blockchain, artificial intelligence / machine learning, and big data and…
Gefunden bei SAP - Thu, 24 Oct 2019 18:37:28 GMT - Zeige alle Walldorf Jobs

          

Developer/Senior Developer (m/f/d) SAP S/4 HANA - SAP Innovative Business Solutions - SAP - Walldorf   

Cache   

We make innovation real by using the latest technologies around the Internet of Things, blockchain, artificial intelligence / machine learning, and big data and…
Gefunden bei SAP - Fri, 18 Oct 2019 12:37:28 GMT - Zeige alle Walldorf Jobs

          

Senior Project Manager (m/f/d) SAP Innovative Business Solutions - SAP - Walldorf   

Cache   

We make innovation real by using the latest technologies around the Internet of Things, blockchain, artificial intelligence / machine learning, and big data and…
Gefunden bei SAP - Mon, 14 Oct 2019 18:37:16 GMT - Zeige alle Walldorf Jobs

          

IT / Software / Systems: Senior Data Engineer - SQL / Redshift / AWS - Premier Ecommerce Publishing Brand - Los Angeles, California   

Cache   

Are you a Senior Data Engineer with a strong SQL, ETL Redshift and AWS background seeking an opportunity to work with massive amounts of data in a very hip marketplace? Are you a Senior Data Engineer interested in unifying data across various consumer outlets for a very well-funded lifestyle brand in the heart of Santa Monica? Are you an accomplished Senior Data Engineer looking for an opportunity to work in a cutting-edge tech environment consisting of; SQL, Redshift, Hadoop, Spark, Kafka and AWS? If yes, please continue reading.... Based in Santa Monica, this thriving lifestyle brand has doubled size in the last year and keeps on growing With over $75 million in funding, they work hard to provide their extensive audience with advice and recommendations in all things lifestyle: where to shop, eat, travel, etc. Branching into a number of different services and products over the next 12 months, they are building out their Engineering team. They are looking for a Senior Data Engineer to unify and bring to life mass amounts of data from all areas of the business; ecommerce, retail, content, web, mobile, advertising, marketing, experiential and more. 
WHAT YOU WILL BE DOING: Architect new and innovative data systems that will allow individuals to use data in impactful and exciting ways Design, implement, and optimize Data Lake and Data Warehouses to handle the needs of a growing business Build solutions that will leverage real-time data and machine learning models Build and maintain ETL's from 3rd party sources and ensure data quality Create data models at all levels including conceptual, logical, and physical for both relational and dimensional solutions Work closely with teams to optimize data delivery and scalability Design and build complex solutions with an emphasis on performance, scalability, and high-reliability Design and implement new product features and research the next wave of technology WHAT YOU NEED: Extensive experience and knowledge of SQL, ETL and Redshift Experience wrangling large amounts of data Skilled in Python for scripting Experience with AWS Experience with Big Data tools is a nice plus; Hadoop, Spark, Kafka, Ability to enhance and maintain a data warehouse including use of ETL tools Successful track record in building real-time ETL pipelines from scratch Previous Ecommerce or startup experience is a plus Understanding of data science and machine learning technologies Strong problem solving capabilities Strong collaborator and is a passionate advocate for data Bachelor's Degree in Computer Science, Engineer, Math or similar WHAT YOU GET: Join a team of humble, creative and open-minded Engineers shipping exceptional products consumers love to use Opportunity to work at an awesome lifestyle brand in growth mode Brand new office space, open and team oriented environment Full Medical, Dental and Vision Benefits 401k Plan Unlimited Vacation Summer vacations / Time off Offices closed during winter holidays and new years Discounts on products Other perks So, if you are a Senior Data Engineer seeking an opportunity to grow with a global lifestyle brand at the cusp of something huge, apply now 

          

Application Systems Engineer 6 - Lead Developer   


Job Description
Important Note: During the application process, ensure your contact information (email and phone number) is up to date and upload your current resume prior to submitting your application for consideration. To participate in some selection activities you will need to respond to an invitation. The invitation can be sent by both email and text message. In order to receive text message invitations, your profile must include a mobile phone number designated as "Personal Cell" or "Cellular" in the contact information of your application.

At Wells Fargo, we want to satisfy our customers' financial needs and help them succeed financially. We're looking for talented people who will put our customers at the center of everything we do. Join our diverse and inclusive team where you'll feel valued and inspired to contribute your unique skills and experience. Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you.

Wells Fargo Technology sets IT strategy; enhances the design, development, and operations of our systems; optimizes the Wells Fargo infrastructure footprint; provides information security; and enables continuous banking access through in-store, online, ATM, and other channels to Wells Fargo's more than 70 million global customers.

The Wholesale Technology group is seeking a senior technology lead developer who acts in the highest-level technical role as an individual contributor and/or team lead for the most complex computer applications and/or application initiatives. Utilizes a thorough understanding of available technology, tools, and existing designs. Works on the most complex problems where analysis of situations or data requires evaluation of intangible variance factors. Plans, performs, and acts as the escalation point for the most complex platform designs, coding, and testing. Leads the most complex modeling, simulation, and analysis efforts. 
Acts as expert technical resource to programming staff in the program development, testing, and implementation process.
  • Analyze software requirements and highly complex user needs to determine feasibility of design within time and cost constraints.
  • Fill a technical lead role on strategic projects within the organization with an ability to transfer knowledge to members of the team and act as an expert, value-added resource.
  • Proven problem solving abilities with a history of building reliable and high performance software solutions.
  • Ensure quality, security, and compliance requirements are met for supported areas.
  • Lead and coordinate maintenance of software systems and installation/upgrade of software systems.
  • Work in an Agile and collaborative environment with business partners, technology architects and other technology partners.
  • Develop, execute, and manage the maintenance of security plans, risk assessments, recovery planning, incident management, testing procedures, training, and reporting on the execution of deliverables designed for program maturity.
  • Expert knowledge of Jira for project management tracking (Epic, Story, and Issue management)
  • Develop original and innovative solutions to complex challenges and provide coding guidance to less experienced staff.
  • Be well-informed in general technology trends (cloud, web technologies, etc.) and be able to make recommendations based on our business needs.
  • Ability to adapt and learn new emerging technologies.
  • Contribute to DevOps, test automation, and automated builds and deployments.
  • Champion continuous improvements in the organization.

    Required Qualifications
    • 10+ years of application development and implementation experience
    • 7+ years of Tableau experience
    • 7+ years of relational database experience
    • 7+ years of experience delivering Business Intelligence (BI), analytics and reporting using API Services architecture
    • 5+ years of UI (User Interface) experience
    • 3+ years of Java or Python experience
    • 3+ years of configuration experience with Cloud service providers such as Amazon Web Services (AWS), Google Cloud Platform (GCP) or MS Azure
    • 4 + years of experience building predictive models
    • 3+ years of Big Data experience
    • 2+ years of web development experience using version 2.0 of Angular or greater
    • 7+ years of PL/SQL experience

      Desired Qualifications
      • Excellent verbal, written, and interpersonal communication skills
      • 4+ years of MS SQL server experience
      • 3+ years of business intelligence experience
      • Ability to develop trending reporting and other internal reporting tools
      • Experience designing and optimizing complex SQL queries involving table joins and correlated sub-queries on large scale data tables
      • 3+ years of analytics experience
      • 3+ years of financial industry experience
      • 3+ years of experience with Waterfall and Agile project methodologies
      • 4+ years of .net experience
      • 4+ years of ASP.Net experience
      • A BS/BA degree or higher in information technology
      • Good analytical skills with high attention to detail and accuracy
      • Ability to be flexible and adjust plans quickly to meet changing business needs
      • Ability to grasp complex business issues quickly, recommend solutions, and drive for resolutions
      • Ability to influence, partner, and negotiate with senior business leaders to gain commitment to accomplish business goals
      • Ability to quickly and accurately execute tactical deliverables
      • Ability to lead during times of ambiguity and change
      • Ability to partner as a team member resource
      • Ability to take initiative, identify opportunities and implement change
      • Ability to translate business needs into complex analysis, designs and recommendations. Uses analysis to identify and define business requirements, while supporting the validity of the final product.
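One of the desired qualifications above, designing and optimizing complex SQL queries involving table joins and correlated sub-queries, can be sketched with a small hypothetical example (table and data invented for illustration; shown via Python's built-in `sqlite3` rather than MS SQL Server):

```python
import sqlite3

# Hypothetical data: employees and their department salaries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (name TEXT, dept TEXT, salary REAL);
    INSERT INTO employees VALUES
        ('Ann',  'eng',   120),
        ('Bob',  'eng',    80),
        ('Cara', 'sales',  90),
        ('Dan',  'sales',  70);
""")

# Correlated sub-query: the inner SELECT references the outer row's
# dept (e.dept), so it is logically re-evaluated per outer row.
# Here it finds employees paid above their own department's average.
rows = conn.execute("""
    SELECT e.name
    FROM employees AS e
    WHERE e.salary > (SELECT AVG(s.salary)
                      FROM employees AS s
                      WHERE s.dept = e.dept)
    ORDER BY e.name
""").fetchall()
```

On large tables, an optimizer may rewrite such a correlated sub-query as a join against a pre-aggregated per-department average; recognizing when to do that rewrite by hand is the optimization skill the posting alludes to.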

        Job Expectations
        • Flexibility to work in a 24/7 environment, including weekends and holidays
        • Position Hours: Monday - Friday (as early as 7 a.m. and as late as 6 p.m.).
        • Ability to work additional hours as needed

          Street Address
          NC-Charlotte: 401 S Tryon St - Charlotte, NC


          Disclaimer

          All offers for employment with Wells Fargo are contingent upon the candidate having successfully completed a criminal background check. Wells Fargo will consider qualified candidates with criminal histories in a manner consistent with the requirements of applicable local, state and Federal law, including Section 19 of the Federal Deposit Insurance Act.

          Relevant military experience is considered for veterans and transitioning service men and women.
          Wells Fargo is an Affirmative Action and Equal Opportunity Employer, Minority/Female/Disabled/Veteran/Gender Identity/Sexual Orientation.

          

The Subtle Art of Simultaneously Being Radical and Evolutionary   


DDN’s mission comes down to satisfying, nay delighting our 2500-strong customer base in their highly complex, demanding and evolving requirements. In AI, Big Data and HPC that is no mean feat...

The post The Subtle Art of Simultaneously Being Radical and Evolutionary appeared first on DDN.com.


          

Research Associate   


Position Information

General Information
  Position Number: ******
  Vacancy Open to: All Candidates
  Working Title: Research Associate
  Position Designation: EHRA Non-Faculty
  Employment Type: Time Limited - Full-time
  Months per Year: 12
  Work Schedule: Monday thru Friday, 8-5
  Hours per week: 40
  FLSA Status: Exempt
  Division: Academic Affairs
  Department: Coll of Computing & Informatics (Col)
  Work Location: NC Research Campus, Kannapolis, NC
  Salary Range:

Primary Purpose of Department
The mission of the Bioinformatics & Genomics Department is to develop novel computational approaches to important biological problems, and to provide training in the science that underlies them.

Primary Purpose of Position
We integrate and analyze a variety of big data sets from biomedical research and applications, develop statistical/analytical methods, and implement them into software packages or web servers. Some of them are widely used by tens of thousands of scientists around the world: ******************************************* *************************************** ************************** We apply big data and advanced analytics to solve important and complex problems in biomedicine. For example, we recently were working on autism, a complex brain disorder, using a big data + novel analytics approach: ********************************************************

Summary of Position Responsibilities
Develop statistical and computational methods for big data analysis, integration and visualization. S/he will work on a range of high-throughput omics (genome, transcriptome, metabolome, etc.) data and cutting-edge projects on complex diseases and biomedical problems.

Minimum Education/Experience
PhD (or Master's + 3 years working experience) in bioinformatics, computer science, statistics or related fields.

Preferred Education, Knowledge, Skills and Experience
Technical skills:
  • Solid statistics training
  • Genetics/genomics data analysis (esp. GWAS, Whole Genome/Exome Studies), NGS data analysis, sequence analysis
  • R/Bioconductor
  • Unix/Linux shell
  • Python or Perl
  • Version control: svn or git
  • R package (or software) development is a plus

Other qualifications:
  • Excellent communication and problem-solving skills, attention to detail
  • Ability to work independently and in a team
  • Self-motivated and disciplined; time and project management skills
  • Enjoy computational/statistical method development and data analysis
  • Proven research/development experience and publication record

Necessary Certifications/Licenses:
Preferred Certifications/Licenses:
Special Notes to Applicants: Finalist will be subject to a criminal background check.
Posting Open Date: 08/29/2019
Posting Close Date:
Open Until Filled: Yes
Proposed Hire Date: 09/16/2019
If time-limited, please indicate appointment end date: 06/30/2020
Contact Information: *******************

          

Associate Counsel   


-------------------------

ASSOCIATE COUNSEL

Location:
CARY, NC

Employment Duration:
FULL TIME

-------------------------

DESCRIPTION

Global Knowledge is the world's leading IT and business skills training provider. Offering the most relevant and timely content delivered by the best instructors, we provide customers around the world with their choice of convenient class times, delivery methods and formats to accelerate their success. Our business skills solutions teach essential communications skills, business analysis, project management, ITIL service management, process improvement and leadership development. With thousands of courses spanning from foundational training to specialized certifications, our core IT training is focused on technology partners such as Amazon Web Services, Cisco, Citrix, IBM, Juniper, Microsoft, Red Hat and VMware. We offer comprehensive professional development for technologies like big data, cloud, cybersecurity and networking.

SUMMARY

Provide general legal support for the Company's worldwide operations.



ESSENTIAL DUTIES AND RESPONSIBILITIES

Provide general legal services to the Company. Focus will be negotiation and drafting of commercial contracts and oversight of some of the Company's compliance functions. Other areas of support will include the following legal areas: general corporate (US & International), trademark, labor, mergers & acquisitions, banking, real estate, and licensing.



SUPERVISORY RESPONSIBILITIES None



QUALIFICATIONS - To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.



EDUCATION and/or EXPERIENCE REQUIREMENTS:

JD

NC BAR ADMISSION PREFERRED, BUT NOT IMMEDIATELY REQUIRED.

0-4 YEARS LEGAL EXPERIENCE (IN-HOUSE OR PRIVATE PRACTICE).



COMPETENCES - In addition to the knowledge and experience required to be successful in the job, a person must exhibit specific behaviors to perform the job successfully. The competencies listed below are representative of the behaviors required.



  • Technical/Professional Knowledge
  • Critical Thinking Skills
  • Decision Making
  • Quality Work
  • Managing Work
  • Follow Up



PHYSICAL DEMANDS The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.



CHANGE PROVISIONS

This job description describes the essential functions of this position. These functions may be changed or reassigned at any time due to reasonable accommodations or other reasons.



Global Knowledge is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status.

          

Big Data Engineer   


- Job Type(s): Full Time
- Salary: $128,598 - $174,777 / year
- Industry: Technology
- Job Description: Big Data Engineer. Location: Silicon Valley, CA. Duration: 12+ Months. Job Summary: The Apple Media Products Analytics Engineering team is looking for a top-notch Big Data engineer to develop

          

Taking a reality check on Procurement 4.0   


Alex Saric, smart procurement expert at Ivalua, discusses the reality of Procurement 4.0. Procurement 4.0 has been part of the Spend Management conversation for years; businesses have been working towards digitised smart procurement ecosystems that combine cloud, big data analytics, and automation ...

          

Data Scientist Engineer   


Octo is currently seeking a Platform Security Engineer to join a growing team on an exciting and highly visible project for a DoD customer. The project you will be working on is to define and design the data architecture and taxonomy in preparation for conducting extensive analysis of the data ingested via the Air Force's existing legacy applications, moving to a more evolvable architecture that can better leverage a cloud environment to deliver better technology, reduced program sustainment costs, and higher system reliability. Our approach is to transform legacy applications to be cloud native and reside on a Platform as a Service (PaaS). Additionally, we modernize current applications by breaking them down into loosely coupled micro-services and leveraging a continuous integration / continuous delivery pipeline to enable an agile DevOps strategy. Octo Data Scientists on this project will have an opportunity to receive 6+ months of Pivotal Cloud Foundry training as part of the standard on-boarding process for this project.

You

As a Data Scientist at Octo, you will be involved in the analysis of unstructured and semi-structured data, including latent semantic indexing (LSI), entity identification and tagging, complex event processing (CEP), and the application of analysis algorithms on distributed, clustered, and cloud-based high-performance infrastructures. You will exercise creativity in applying non-traditional approaches to large-scale analysis of unstructured data in support of high-value use cases visualized through multi-dimensional interfaces, and handle processing and index requests against high-volume collections of data and high-velocity data streams. The role requires the ability to make discoveries in the world of big data: strong technical and computational skills (engineering, physics, mathematics) coupled with the ability to design, develop, and deploy sophisticated applications using advanced unstructured and semi-structured data analysis techniques and utilizing high-performance computing environments, and the ability to use advanced tools and computational skills to interpret, connect, predict and make discoveries in complex data and deliver recommendations for business and analytic decisions.

Also expected: experience with software development, either an open-source enterprise software development stack (Java/Linux/Ruby/Python) or a Windows development stack (.NET, C#, C++); experience with data transport and transformation APIs and technologies such as JSON, XML, XSLT, JDBC, SOAP and REST; experience with cloud-based data analysis tools including Hadoop and Mahout, Accumulo, Hive, Impala, Pig, and similar; experience with visual analytic tools like Microsoft Pivot, Palantir, or Visual Analytics; experience with open source textual processing such as Lucene, Sphinx, Nutch or Solr; experience with entity extraction and conceptual search technologies such as LSI, LDA, etc.; and experience with machine learning, algorithm analysis, and data clustering.

Us

We were founded as a fresh alternative in the Government Consulting Community and are dedicated to the belief that results are a product of analytical thinking and agile design principles, and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and act as true partners in advancing our clients' missions.

What we'd like to see:
  • Full-stack software development experience with a variety of server-side languages such as Java, C#, PHP, or Javascript (NodeJS)
  • Experience with modern front-end frameworks like React, Vue, or Angular
  • Intimate knowledge of agile and lean philosophies and experience successfully leading software teams in the practice of these philosophies
  • Experience with Continuous Delivery and Continuous Integration techniques using tools like Jenkins or Concourse
  • Experience with test-driven development and automated testing practices
  • Experience with data analytics, data science, or data engineering, MySQL and/or Postgres, GraphQL, Redit, and/or Mongo
  • Experience building and integrating REST/SOAP APIs at the application and database level, and messaging protocols and formats such as Protobuf, gRPC, and/or RabbitMQ
  • Experience with Pivotal Cloud Foundry
  • Experience with Event/Data Streaming services such as Kafka
  • Experience with Enterprise Service Bus and Event Driven Architectures
  • Experience with prototyping front-end visualization with products such as ElasticStack and/or Splunk
  • Strong communication skills and interest in a pair-programming environment

Bonus points if you:
  • Possess at least one of the Agile Development Certifications:
    • Certified Scrum Master
    • Agile Certified Practitioner (PMI-ACP)
    • Certified Scrum Professional
  • Have proven experience writing and building applications using a 12-factor application software architecture, micro-services, and APIs
  • Are able to clearly communicate and provide positive recommendations of improvements to existing software applications

Years of Experience: 5 years or more
Education: Associate's in a Technical Discipline (Computer Science, Mathematics, or equivalent technical degree)
Clearance: SECRET

          

Industrial Security Rep III/FSO   


Summary / Description

We are currently seeking a motivated, career and customer oriented Industrial Security Rep III/FSO to join our team to begin an exciting and challenging career with Unisys Federal Systems.

Position Summary/Responsibilities: FSO duties include administering and maintaining company security policies in accordance with NISPOM policies. Must possess a broad knowledge of the National Industrial Security Program rules, regulations, and procedures. Performs a variety of security related tasks. Will report to the Director on all tasks and interface with DCSA on security clearance matters, rulings, and interpretations of the National Industrial Security Program Operating Manual (NISPOM) and execution of all assigned classified contracts.

Position Duties:
  • Review, coordinate and publish (amend as needed) security program governance documentation, e.g. policies, procedures, and Standard Practice Procedures (SPP) manuals
  • Maintenance of FCL documentation in NISS
  • Process outgoing visit authorizations for personnel assigned to contracts that require cleared access
  • Validate the personnel security clearance of new personnel; if required, initiate action for the proper investigation to be conducted
  • Define, update, maintain and produce reports from the JPAS (Joint Personnel Adjudication System) database
  • Administer security education/training to cleared personnel
  • Conduct briefing/debriefing of individuals requiring access as required
  • Maintain an ongoing security education program, e.g. design updates to annual online training, publish security awareness notices, participate in security awareness venues
  • Administer processes to obtain and maintain Facility Clearance Levels (FCL) and Personnel Clearance Levels (PCL); perform maintenance of appropriate and accurate security folders
  • Participate in security self-assessments, DCSA (formerly DSS) Vulnerability Assessments, and all other audits
  • Aid in development and dissemination of the Insider Threat Awareness program as required by the NISPOM
  • Advise personnel of their reporting requirements, both personal/administrative and compliance/incident-related reports
  • Prepare DD-254 documentation and appropriate security guidance for subcontractors
  • Conduct reviews of and recommend modifications to Prime DD Form 254s and Visitor Group Security Agreements (VGSA)
  • Prepare Facility Clearance (FCL) Sponsorship documentation for subcontractors requiring new FCLs
  • Prepare and authorize courier orders and cards up to the Top Secret level
  • Maintain access control systems including badging, access levels, visitor badges, and temporary badges
  • Maintain secure and closed areas with IS

Requirements
  • Bachelor's Degree in a related field with 10-12 years of experience as an FSO
  • CPP, ISP or other security related certification preferred
  • Strong working knowledge of the National Industrial Security Program Operating Manual (NISPOM) and ICD 705
  • Expert knowledge of JPAS/DISS and NISS
  • Expert knowledge of safeguarding procedures
  • Expert knowledge of SF-86, SF-312, DD-254, and other required forms
  • Active DoD Top Secret Clearance with SCI
  • Strong working knowledge of SIPRNet/Closed Areas/Secured Spaces (in construction and maintenance)

About Unisys

Do you have what it takes to be mission critical? Your skills and experience could be mission critical for our Unisys team supporting the Federal Government in their mission to protect and defend our nation, and transform the way government agencies manage information and improve responsiveness to their customers. As a member of our diverse team, you'll gain valuable career-enhancing experience as we support the design, development, testing, implementation, training, and maintenance of our federal government's critical systems. Apply today to become mission critical and help our nation meet the growing need for IT security, improved infrastructure, big data, and advanced analytics.

Unisys is a global information technology company that solves complex IT challenges at the intersection of modern and mission critical. We work with many of the world's largest companies and government organizations to secure and keep their mission-critical operations running at peak performance; streamline and transform their data centers; enhance support to their end users and constituents; and modernize their enterprise applications. We do this while protecting and building on their legacy IT investments. Our offerings include outsourcing and managed services, systems integration and consulting services, high-end server technology, cybersecurity and cloud management software, and maintenance and support services. Unisys has more than 23,000 employees serving clients around the world. Unisys offers a very competitive benefits package including health insurance coverage from first day of employment, a 401k with an immediately vested company match, and vacation and educational benefits. To learn more about Unisys, visit us at www.Unisys.com.

Unisys is an Equal Opportunity Employer (EOE) - Minorities, Females, Disabled Persons, and Veterans.

          

AI/ML Executive Architect   


Summary / Description

Unisys is seeking candidates to make a difference by providing meaningful solutions to help our government secure the nation and fulfill the mission of government most effectively and efficiently. We are looking for candidates for the Artificial Intelligence/Machine Learning Executive Architect role for our corporate office in Reston, VA.

The role of an AI/ML Solution Executive includes:
  • Educates Unisys Federal Delivery Leadership, our existing clients and prospects as to emerging opportunities to apply AI/ML analytics to better leverage government data to make more timely and better mission decisions
  • Provides the AI/ML vision for Unisys Federal
  • Participates actively in providing technical leadership for AI/ML opportunities in the new business development cycle, from deal identification, participating in call plans, and driving solution strategy, to responding to solicitations and participating in tech challenges/hackathons to showcase our AI/ML skills
  • Works with Unisys business development, program teams, capture and account teams to engage customers to best understand their AI/ML needs and to present Unisys capabilities, offerings and solutions in a compelling manner, thereby shaping customer perspectives
  • Leads the establishment and sustainment of the Unisys portfolio of capabilities for AI/ML, including marketing literature, proposal content, BoE/rate cards, proof points, reference architectures and proofs of concept/demoware
  • Provides deep domain expertise regarding AI/ML, data modeling, enterprise data warehousing, data integration, data quality, master data management, and statistical analyses of primarily structured datasets
  • Provides deep domain expertise of AI/ML algorithms, tooling and solutions to solve mission problems for Unisys Federal clients
  • Provides expertise in building government-oriented solutions leveraging NoSQL solutions, big data (Hadoop/Apache Spark), Geographic Information Systems (GIS), key-value pair, columnar, graph, search, natural language processing, data science, machine learning and data visualization
  • Drives market demand for AI/ML solutions by providing concise messages tailored for Unisys customers and their desired outcomes
  • Defines our go-to-market strategy for AI/ML
  • Collaborates closely with our corporate solutions organizations and alliance partners to incubate, design and deliver AI/ML offerings
  • Curates proof points and past performance qualifications for Unisys success stories for applying AI/ML capabilities supporting the mission of government
  • Identifies market trends in technology for AI/ML solutions
  • Collaborates with Unisys Commercial Solutions organizations to prioritize corporate investments in AI/ML solutions
  • Works with business units in tailoring capability strategies specific to them and works with appropriate government relationships to shape agency procurement
  • Shapes procurements through presentations to clients and other speaking engagements
  • Determines which alliances to pursue and events for Unisys to participate in

The AI/ML Executive is intimately familiar with market trends, helps to define go-to-market strategy, and ensures that Unisys is in a position to be the best choice for meeting our customers' AI/ML needs through collaboration with customers, partners, and internal stakeholders to understand the requirements and connect them with Unisys capabilities and offerings.

Requirements

Required Skills:
  • Master's degree and 20 years of relevant experience or equivalent
  • Strong expertise in designing and delivering AI/ML/Deep Learning solutions
  • Expertise and experience implementing technology solutions in four or more of the following areas: database design, data warehousing, data governance, metadata management, big data, NoSQL, data science, data analytics, machine learning, natural language processing, streaming data
  • Experience with scientific scripting languages (e.g. Python, R) and object-oriented programming languages (e.g. Java, C#)
  • Strong expertise with machine learning and deep learning models and algorithms
  • Solid grounding in statistics, probability theory, data modeling, machine learning algorithms and software development techniques and languages used to implement analytics solutions
  • Deep experience with data modeling and Big Data solution stacks
  • Deep knowledge of enterprise IT technologies, including databases, storage, and networks
  • Deep experience with one or more Deep Learning frameworks such as Apache MXNet, TensorFlow, Caffe2, Keras, Microsoft Cognitive Toolkit, Torch and Theano
  • A successful track record in providing technical leadership in federal new business pursuits
  • In-depth understanding of application, cloud, middleware, data management and system architecture concepts; experience leading the design and integration of enterprise-level technical solutions
  • Experience in capturing technical requirements and defining technical solutions in the form of conceptual, logical, and physical designs, including the ability to articulate those concepts verbally, graphically and in writing
  • Ability to synthesize solution design information, architectural principles, available technologies, third-party products, and industry standards to formulate a system architecture that meets client requirements and can be delivered within the desired timeframe
  • Experience developing cost models, technical delivery plans, technical solutions and bases of estimates (BOEs), including BOM development; able to develop concepts of operations and discuss these models in Agile, federal SDLC or ITIL based terms
  • Experience identifying potential design, performance, security, and support problems, including the ability to identify technical risks/challenges and develop relevant mitigation strategies
  • Extensive knowledge of the broad spectrum of technology areas, including technology trends, forthcoming industry standards, new products, and the latest solution development techniques; ability to leverage this knowledge to formulate technical solution strategy
  • Ability to consistently apply architectural guidelines when creating new solution architectures
  • Ability to develop an integrated technology requirements project plan
  • Ability to interface with team members at all levels, including business operations, finance, technology, and management
  • Was primary author for a technical conference or whitepaper submission (to be provided)

Desired Qualifications:
  • Certifications from leading analytics platform providers (Cloudera, Horton, Databricks, AWS, Microsoft, etc.)
  • Experience in leading remote teams in building demonstrations and proofs of concept
  • Experience in classical DMBOK data management practices including data governance, data quality management, master data management, metadata management practices and tools
  • Deep knowledge of Federal domain-specific data formats and structures, data storage, retrieval, transport, optimization, and serialization schemes
  • Demonstrated experience developing engineering solutions for both structured and unstructured data, including data search
  • Experience working with very large (petabyte scale) datasets including data integration, analysis and visualization
  • Experience with data integration and ETL tools (e.g. Apache NiFi, SSIS, Informatica, Talend, Azure Data Factory)

About Unisys

Do you have what it takes to be mission critical? Your skills and experience could be mission critical for our Unisys team supporting the Federal Government in their mission to protect and defend our nation, and transform the way government agencies manage information and improve responsiveness to their customers. As a member of our diverse team, you'll gain valuable career-enhancing experience as we support the design, development, testing, implementation, training, and maintenance of our federal government's critical systems. Apply today to become mission critical and help our nation meet the growing need for IT security, improved infrastructure, big data, and advanced analytics.

Unisys is a global information technology company that solves complex IT challenges at the intersection of modern and mission critical. We work with many of the world's largest companies and government organizations to secure and keep their mission-critical operations running at peak performance; streamline and transform their data centers; enhance support to their end users and constituents; and modernize their enterprise applications. We do this while protecting and building on their legacy IT investments. Our offerings include outsourcing and managed services, systems integration and consulting services, high-end server technology, cybersecurity and cloud management software, and maintenance and support services. Unisys has more than 23,000 employees serving clients around the world. Unisys offers a very competitive benefits package including health insurance coverage from first day of employment, a 401k with an immediately vested company match, and vacation and educational benefits. To learn more about Unisys, visit us at www.Unisys.com.

Unisys is an Equal Opportunity Employer (EOE) - Minorities, Females, Disabled Persons, and Veterans.

          

Sr. Splunk Delivery Engineer   

Cache   

Job ID: ******** Updated: Oct 16, 2019 Location: Toledo, OH, United States The Senior Splunk Delivery Engineers at CDWG team with other engineers on highly visible, technically challenging projects and offerings. This engineer will be responsible for designing, configuring, maintaining, and troubleshooting customer-specific Security Information & Event Management (SIEM) environments. The position is federally focused and the engineer will need to operate within a Department of Defense/classified environment. Key Areas of Responsibility - Provides professional services / Splunk architecture: responsible for design, implementation, and troubleshooting of the customer's SIEM/Splunk environment. - Expertise developing security-focused content for Splunk, including creation of complex threat detection logic and operational dashboards. - Regularly interface with technical and business staff of customers, including the project sponsor and stakeholders of projects in more complex engagements. - Serve as technical point of contact on customer engagements. - Manage time and expense to meet or exceed expectations defined in the Statement of Work. - Provide high-quality content deliverables using the appropriate document templates. - Ensure the solution is implemented as designed, to the customer's satisfaction and approval. - Educate the customer on the solution as appropriate throughout the life of the project or service. - Conduct throughput analysis, problem solving, and infrastructure planning. - Participate in various client projects intended to continually improve/upgrade SIEM environments. - Assist in the management of projects using CDWG's project management methodology. - Work with Professional Services Managers, OEMs, Project Managers and customers to manage expectations and timelines to ensure expectations and commitments are being met. 
The information in this position description is intended to convey information about the key responsibilities and requirements of the position. It is not an exhaustive list of the skills, efforts, duties, responsibilities or working conditions associated with the opportunity. Responsibilities are subject to change. Qualifications Minimum Qualifications - Bachelor's degree or equivalent years of military service - 7 years of delivering complex technical solutions, including planning, development, implementation and support, covering failover techniques, recovery/rollback and application partitioning - 7 years of Federal and/or DoD experience - 5 years of current experience in technical consulting or big data analytics - 5 years of cyber security experience (security analytics, SOC experience) - 3 years of Splunk engineering experience - DoD 8570 level II IAT Certification - DoD/OPM Secret Clearance (S) - Splunk Certified Admin, Splunk Certified Architect, Splunk Certified Consultant Other Required Qualifications - Proficiency with Splunk App/TA configuration - Management/deployment experience with large-scale/distributed Splunk environments - Proficiency developing log ingestion and aggregation strategies - Familiarity with key security events on common IT platforms - Deep proficiency in client and server operating systems, including Windows, Mac, and Linux - General networking and security troubleshooting (firewalls, routing, NAT, etc.) - Scripting and development skills (BASH, Perl, Python or Java) with strong knowledge of regular expressions - Experience implementing Defense Information Systems Agency (DISA) Secure Technical Implementation Guidelines (STIGs) within a Department of Defense environment. - Experience remediating identified Information Assurance Vulnerability Alerts (IAVAs) within DoD systems - Willing to travel (50%) - Strong organizational skills. - Excellent attention to detail. - Ability to work independently and to manage time effectively. 
- Effective communication skills, with an appreciation for the appropriate ways to interact with managers, coworkers, customers and vendors. Preferred Qualifications - Common Information Model (CIM) validation - Universal/Heavy Forwarder configuration experience (including encryption and compression settings) - Deep experience with Splunk Enterprise Security CUSTOMER FOCUS AND QUALITY MANAGEMENT RESPONSIBILITIES: Each CDW coworker is responsible for maintaining customer focus and conforming to the CDW quality management system. Specific responsibilities include: Job Category: Delivery Engineering. Job Type: Full-Time. Travel Percentage: 50%.
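One requirement above, scripting with strong knowledge of regular expressions for log ingestion, lends itself to a short illustration. This is a generic Python sketch over a made-up firewall log format; it is not tied to any real Splunk or CDWG deployment, and all field names are invented:

```python
import re

# Hypothetical syslog-style firewall line; real SIEM sources vary widely.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+"
    r"(?P<host>\S+)\s+"
    r"(?P<action>ALLOW|DENY)\s+"
    r"src=(?P<src>\d{1,3}(?:\.\d{1,3}){3})\s+"
    r"dst=(?P<dst>\d{1,3}(?:\.\d{1,3}){3})"
)

def parse_line(line):
    """Extract structured fields from one log line; return None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

event = parse_line("2019-10-16T08:30:00 fw01 DENY src=10.0.0.5 dst=192.168.1.9")
# event["action"] == "DENY", event["src"] == "10.0.0.5"
```

In a real SIEM pipeline the named groups would become extracted fields for search and dashboards; here they simply populate a dictionary.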

          

HPC Datacenter Intern (Systems Administrator)   

Cache   

Job Description We're looking for a candidate with a passion for working on Intel's latest technology; you will have an opportunity to be among the first on alpha and beta hardware long before it is available to the public. The responsibilities will include, but are not limited to, the following: - Supporting a variety of compute server hardware and software for high performance, consistency and availability. - Being heavily involved in frequent HW/SW installs, upgrades, troubleshooting and support, with the goal of enhancing the performance, reliability and manageability of the clusters as they are used for benchmarking and software development. - Protecting Intel IP by utilizing Intel's system of record for tracking early hardware. This will be an exciting opportunity to support the latest Intel HPC data center technologies, including servers, fabric, and storage. HPC Frontier Labs / CRT-DC in Rio Rancho, New Mexico runs Intel's High Performance Computing benchmarking cluster Endeavour, which currently ranks in the HPC Top 500 and Green 500. Other clusters include Cloud, Big Data, Network Functions Virtualization, and other technology clusters. The ideal candidate should exhibit the following behavioral traits: - Self-driven attitude - Team player - Effective communication skills - Troubleshooting skills. Qualifications You must possess the below minimum qualifications to be initially considered for this position. The experience listed below would be obtained through a combination of your school work/classes/research and/or relevant previous job and/or internship experiences. Minimum Requirements: Must be pursuing a bachelor's degree in Computer Science, Computer Engineering, Electrical Engineering, or a related field. Minimum of 3 months of experience with: - Programming in 1 or more of the following languages (C, Fortran, Perl, Python or Bash) - Linux or UNIX operating system. Candidate must be willing to lift up to 35 lbs. 2 times per week. 
Availability for a 12-month internship is required. This U.S. position is open to U.S. Workers only. A U.S. Worker is someone who is either a U.S. Citizen, U.S. National, U.S. Lawful Permanent Resident, or a person granted Refugee or Asylum status by the U.S. Government. Intel will not sponsor a foreign national for this position. Preferred Qualifications Experience installing and managing the Linux operating system on a server. Experience administering a Linux server for multiple users. Understanding of the technical concepts, architecture, systems, development methods, and disciplines associated with the defined program, and the ability to utilize that knowledge to accelerate project completion. Experience with MPI libraries, preferably Intel MPI. Experience with IB networks, preferably Omni-Path. Experience writing HPC applications. Experience administering Lustre. Inside this Business Group Intel Architecture, Graphics, and Software (IAGS) brings Intel's technical strategy to life. We have embraced the new reality of competing at a product and solution level, not just a transistor one. We take pride in reshaping the status quo and thinking exponentially to achieve what's never been done before. We've also built a culture of continuous learning and persistent leadership that provides opportunities to practice until perfection and filter ambitious ideas into execution. Posting Statement All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.

          

Artificial intelligence and Big Data: the International Forum on the Digitalisation of Energy will take place on 14 November   

Cache   

The central topics will be smart grids, artificial intelligence and big data, cybersecurity, and talent for the digital energy sector. A pitch session for energy startups will also be held.

          

Marketing Manager - Renton, WA or Dallas, TX, WA or TX   

Cache   

Company Overview

Converge360 Events, a division of 1105 Media, keeps managers and professionals current with the information, insight and analysis they need to succeed in development, IT and the channel. Our content platforms include print, digital, online, events and a broad spectrum of marketing services.

**This position is available in Renton, WA OR Dallas, TX**

Job Summary

The Converge360 Events team is currently seeking a Marketing Manager who will be responsible for the success of multiple (15+ yearly) conferences, events, training seminars, and tradeshows through effective attendee acquisition. The Marketing Manager helps develop and manage the strategy, budget, and implementation of multiple marketing plans, and analyzes their ROI.
 
Job Responsibilities
 
You are responsible for achieving three primary performance objectives:
 
1)  Create, implement, and execute marketing strategy for 1105 Media, Inc.’s Converge360 Event Brands, including Visual Studio Live!, TechMentor, and Live! 360, among others
  • Plan and execute extensive multi-touch marketing campaigns, to include email (both outbound database marketing and more strategic inbound/automated methods), brochures, banner ads, website development, print advertising, social media, Google AdWords, direct mail, telemarketing, content marketing, and other resources to acquire in-person and online event attendees.
  • Work closely with the Event VP, Art, Online Media, Sales, and other Converge360 staff to develop event themes, designs, and overall vision.
 
2) Execute superior marketing copywriting and design for a variety of marketing collateral, print, and online campaigns
  • Demonstrate expertise in best practices for subject lines, headlines, technical marketing copy, and email design.
  • Translate event goals into compelling marketing materials (websites, emails, print ads, web ads, etc.).
  • Maintain clear and concise messaging frameworks for each event brand promotion.
  • Manage the design and production of all collateral, ensuring consistent branding and messaging across all marketing materials.
 
3)  Manage Marketing Personnel, Budgets, and Projects
  • Manage, lead, and motivate the Marketing Coordinator.
  • Manage relationships with outside vendors and contractors.
  • Track and monitor the response to all executed efforts, making contingency plans where necessary to meet attendee targets.
  • Create and deliver ongoing reports on attendee numbers and associated revenue, campaign results, and ROI.
  • Employ data analytics and data mining techniques to effectively target conference attendees.
  • Manage overall marketing expense budget and work with finance in tracking and monitoring attendee revenue.
 
Qualifications:
 
Required
  • Bachelor’s Degree
  • 5-8 years progressive marketing experience
  • Superior marketing copywriting skills
  • Experience with marketing automation programs and/or inbound marketing programs (Marketo, Omeda, etc.)
  • Experience in HTML
  • Superior Excel proficiency 
  • Experience with leveraging direct marketing tactics
  • Experience in managing email campaigns and multi-touch campaigns
  • Experience working with art/graphic and web/online teams
  • Experience developing content marketing (blogs, whitepapers, etc.)
  • Experience in social media campaign management and execution
  • Experience with running focus groups
  • Knowledge of SEM & SEO marketing techniques
  • Experience with Google AdWords and Google Analytics
  • Must be able to set priorities and juggle multiple concurrent projects
  • A strong work ethic and the capacity to succeed in a fast-paced environment
We offer a competitive salary and a comprehensive benefits package that includes medical/dental/vision insurance, life insurance, disability insurance, 401(k) plan and a generous paid time off (PTO) / holiday plan.

We are proud to be an Equal Opportunity Employer. 

Corporate Profile:
1105 Media, Inc., is the leading provider of B2B media services in the ABM, big data, education, technology, enterprise computing, government technology and infrastructure markets. Our products focus on technology, policy, regulation and news, delivered through channels including print and online magazines, journals and newsletters; seminars, conferences, executive summits and trade shows; training and courseware; and web-based services. 1105 Media is based in Woodland Hills, CA, with offices throughout the United States.

 

          

Software Developer   

Cache   

Recruiter Bechtel Location Reston, Virginia Salary Competitive Posted 02 Nov 2019 Closes 02 Dec 2019 Ref ********* Sector Oil and Gas Category Information Technology Contract Type Permanent Hours Full Time Requisition ID: ****** Bechtel is seeking a talented, energetic, ambitious Software Developer who wants to join the software development team in Reston, VA. As a Developer you will have the opportunity to be a member of our global software development organization, working with state-of-the-art tools and technologies to build the next wave of software applications for our global enterprise. You'll get the chance to interface with customers, create new products and web and mobile applications, and improve on existing systems and code. The products that you build will touch thousands of users across the globe. You will support the development of the Cost Management System (EcoSys) within Project Controls and facilitate enterprise-wide initiatives as they relate to the Project Controls systems, supporting the functional team in analyzing complex tasks and providing solutions to demanding challenges that require initiative and independent System Analyst judgment. The role demands a systematic, disciplined and analytical approach to problem solving. This position will mainly focus on system configuration for EcoSys, planning and conducting independent work that requires judgment in the adaptation of information systems techniques, procedures and criteria in areas including application support, software engineering, software quality assurance and quality control. 
Responsibilities
- Server-side software development expertise to design, develop and implement horizontally scalable components able to handle large transaction and query volumes
- System development experience in EcoSys configuration
- Drive technical excellence and implementation of best data engineering practices
- Develop full-stack applications that are scalable, robust, and thoughtfully designed
- Design database schemas
- Take the application through the full software development lifecycle, from design to implementation
- Ship production code to a large customer base, taking full ownership of the work, including testing and deployments
- Collaborate with other software engineers, domain experts, and end users to build the right solutions that address the business needs
- Oversee design, scoping, implementation, and testing in short agile release cycles
- Interface with Software Quality Assurance and deployment teams
- Identify and implement technical innovations to improve work processes
- Work closely with the lead developer(s) and users to ensure that business requirements are accurately represented in the design and implementation phases
- Learn Project Controls work processes and apply that knowledge in system development

Basic Qualifications
- Bachelor's degree in Computer Science, Computer Engineering, CIS, MIS or a related field of study
- 7 years of strong software design and development experience, building and administering large-scale distributed applications
- 5 years of Oracle, SQL Server, C#, *******, API development and Javascript experience

Preferred Qualifications & Skills
- Ability to develop in Power BI, data flow and MS Excel
- Knowledge of data models, database design and related technologies
- Knowledge of Power BI/Data Flow development
- Knowledge of Azure and the Big Data ecosystem
- Experience in any of the following: master, reference and metadata management; data quality; integration; interoperability
- Query languages for property-graph or knowledge-graph, NoSQL, and relational databases, e.g. Gremlin, Cypher, SQL, GraphQL
- Data models for property-graph databases, e.g. Cosmos DB, Neo4j
- CI/CD experience with Git, TFS, Azure DevOps and/or containers
- Experience with large data volumes and a high transaction rate on one or more technologies such as API gateway is a plus

Shaping tomorrow together: Bechtel is one of the most respected global engineering, construction, and project management companies. Together with our customers, we deliver landmark projects that foster long-term progress and economic growth. Since 1898, we've completed more than 25,000 extraordinary projects across 160 countries on all seven continents. We operate through four global businesses: Infrastructure; Nuclear, Security & Environmental; Oil, Gas & Chemicals; and Mining & Metals. Our company and our culture are built on more than a century of leadership and a relentless adherence to our values, the core of which are safety, quality, ethics, and integrity. These values are what we believe, what we expect, what we deliver, and what we live. *************** Bechtel is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
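The property-graph data model named in the preferred qualifications can be sketched in plain Python to show the idea behind stores such as Cosmos DB or Neo4j: nodes and edges that both carry key/value properties. The class, schema and data here are invented for illustration, not an API of either product:

```python
class PropertyGraph:
    """Toy property graph: nodes and edges both carry key/value properties."""

    def __init__(self):
        self.nodes = {}   # node_id -> properties dict
        self.edges = []   # (src_id, label, dst_id, properties dict)

    def add_node(self, node_id, **props):
        self.nodes[node_id] = props

    def add_edge(self, src, label, dst, **props):
        self.edges.append((src, label, dst, props))

    def out_neighbors(self, node_id, label):
        """Follow outgoing edges with a given label (a rough analogue of Gremlin's out())."""
        return [dst for (src, lbl, dst, _) in self.edges
                if src == node_id and lbl == label]

g = PropertyGraph()
g.add_node("p1", kind="project", name="Cost System Upgrade")
g.add_node("e1", kind="engineer", name="Alice")
g.add_edge("e1", "WORKS_ON", "p1", role="developer")
projects = g.out_neighbors("e1", "WORKS_ON")
# projects == ["p1"]
```

Real graph databases add indexing, persistence and a query language on top, but the underlying model is this: labeled, property-bearing vertices and edges.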

          

.NET Developer   

Cache   

BI Business Analyst - Louisville, KY. Note: This is a full-time, direct-hire, W2 position. No C2C, 3rd party or H-1B at this time. We appreciate all of our applicants, but we are not able to sponsor at this time. What we are looking for: Must have: 1. .NET Core/C# and Web API (candidates with Node or Java exposure are also welcome; the team works across Node, Java and .NET rather than being .NET-only). 2. Highly team-focused: gets along well with everyone, fits the team culture, and isn't afraid to ask for help. 3. AWS/Azure experience. 4. Works independently on assigned tasks, but knows when to ask for help. Nice to have: 1. Microservice architecture 2. .NET Core experience 3. Big Data (Elastic, MongoDB and other non-relational stores) 4. NoSQL. Send resumes to David Chiles, davc011@kellyservices.com. Why Kelly? With Kelly, you'll have direct connections to leading IT organizations in the best companies around the globe, offering you the chance to work on some of today's most intriguing, innovative and high-visibility projects. In a field where change is the only constant, our connections and opportunities will help you take your career exactly where you want to go. We work with 90 of the Fortune 100 companies and found opportunities for more than 8,600 IT professionals last year. Let us help advance your career today. About Kelly: At Kelly, we're always thinking about what's next and advising job seekers on new ways of working to reach their full potential. In fact, we're a leading advocate for temporary/nontraditional workstyles, because we believe they allow flexibility and tremendous growth opportunities that enable a better way to work and live. Connecting great people with great companies is what we do best, and our employment opportunities span a wide variety of workstyles, skill levels, and industries around the world. 
Kelly is an equal opportunity employer committed to employing a diverse workforce, including, but not limited to, minorities, females, individuals with disabilities, protected veterans, sexual orientation, gender identity. Equal Employment Opportunity is The Law. - provided by Dice

          

Apps Systems Engineer 5   

Cache   

Apps Systems Engineer 5 - Wells Fargo - St. Louis, MO. Overview: Wells Fargo technology teams drive innovation to create a more powerful and fulfilling financial experience for our customers and team members. You will join more than 24,000 team members supporting 95 billion transactions annually in 10 countries. Our career opportunities span the technology spectrum: advanced analytics, big data, information security, application development, cloud enablement, project management and more. SUCCESS PROFILE Check out the top traits we're looking for and see if you have the right mix. Additional related traits listed below: Analytical, Detail-oriented, Insightful, Inventive, Problem Solver, Curious. Benefits Wells Fargo wants to help you get more out of life and take care of things outside the office to make life a little easier. We provide: Medical, Dental and Vision; Employer Matching 401(k); Tuition Reimbursement; Maternity and Paternity Leave; Paid Time Off. Responsibilities Job Description Important Note: During the application process, ensure your contact information (email and phone number) is up to date and upload your current resume prior to submitting your application for consideration. To participate in some selection activities you will need to respond to an invitation. The invitation can be sent by both email and text message. In order to receive text message invitations, your profile must include a mobile phone number designated as Personal Cell or Cellular in the contact information of your application. At Wells Fargo, we want to satisfy our customers' financial needs and help them succeed financially. We're looking for talented people who will put our customers at the center of everything we do. Join our diverse and inclusive team where you'll feel valued and inspired to contribute your unique skills and experience. Help us build a better Wells Fargo. 
It all begins with outstanding talent. It all begins with you. Wells Fargo Technology sets IT strategy; enhances the design, development, and operations of our systems; optimizes the Wells Fargo infrastructure footprint; provides information security; and enables continuous banking access through in-store, online, ATM, and other channels to Wells Fargo's more than 70 million global customers. The Data Management team supports and maintains the Wells Fargo Trust business core data repositories. The following critical skills for supporting the core repositories are desired:
- Acts as a senior-level COBOL developer
- Provides mainframe application support
- Participates in a weekly on-call rotation to support the batch cycle
- Acts as an Informatica/ETL developer in application development and in supporting datamarts
- Works closely with the application lead to understand business requirements, writes functional design specs, and participates in efficient development delivery and deployment
- Evaluates system designs for efficiency, with a focus on performance
- Verifies program logic through preparation of test data and testing and debugging of programs
- Develops documentation per compliance and support needs
- Participates as needed in compliance, risk and regulatory processes within Wells Fargo
- Interacts and collaborates closely with teammates, DBAs, Line of Business partners and other technical teams
- Works with offshore development teams
- Coordinates and leads junior developers
- Supports ad-hoc report requests for reporting analysis and data issue investigation
- Acts in the highest-level technical role as an individual contributor and/or team lead for the most complex computer applications and/or application initiatives
- Utilizes a thorough understanding of available technology, tools, and existing designs
- Works on the most complex problems, where analysis of situations or data requires evaluation of intangible variance factors
Plans, performs, and acts as the escalation point for the most complex platform designs, coding, and testing. Leads the most complex modeling, simulation, and analysis efforts. Acts as an expert technical resource to programming staff in the program development, testing, and implementation process.

Required Qualifications
- 7+ years of application development and implementation experience
- 7+ years of application development experience
- 7+ years of COBOL experience
- 7+ years of mainframe experience
- 7+ years of Informatica experience
- 7+ years of data warehouse experience
- 6+ years of experience with databases such as Oracle, DB2, SQL Server, or Teradata

Desired Qualifications
- Good verbal, written, and interpersonal communication skills
- 2+ years of experience with end-to-end design and delivery of ETL (Extract, Transform, Load) applications
- 2+ years of data modeling experience

Other Desired Qualifications
- Experience with the mainframe suite of tools: COBOL, JCL, CA7, Endevor, TSO, Xpediter, etc.
- Ability to provide leadership to a virtual or remote team
- Experience with end-to-end design and delivery of data warehouse applications
- Experience in a technical lead role guiding less senior development resources
- Experience in, or exposure to, Agile development methodologies
- Experience with Business Continuity Planning and execution
- Familiarity with various database management systems (Oracle, DB2, Sybase, Teradata, SQL Server)
- Experience with Exadata
- Experience with Hadoop/Big Data
- SQL query tuning and optimization experience
- Experience with various data transfer packages (FTP, SFTP, NDM, NDM Secure+)
- Scripting language experience a plus (csh, ksh, perl, etc.)
- Familiarity with automated deployment tools (GitHub, Jenkins, uDeploy)

Disclaimer: All offers for employment with Wells Fargo are contingent upon the candidate having successfully completed a criminal background check. 
Wells Fargo will consider qualified candidates with criminal histories in a manner consistent with the requirements of applicable local, state and Federal law, including Section 19 of the Federal Deposit Insurance Act. Relevant military experience is considered for veterans and transitioning service men and women. Wells Fargo is an Affirmative Action and Equal Opportunity Employer, Minority/Female/Disabled/Veteran/Gender Identity/Sexual Orientation.
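The ETL (Extract, Transform, Load) pattern named in the qualifications can be pictured with a minimal sketch. This is a generic Python illustration, not Wells Fargo's Informatica or COBOL tooling; the record layout and field names are invented:

```python
import csv
import io

def extract(raw_csv):
    """Extract: read raw records from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalize account names and convert amounts to integer cents."""
    return [{
        "account": row["account"].strip().upper(),
        "amount_cents": int(round(float(row["amount"]) * 100)),
    } for row in rows]

def load(rows, target):
    """Load: append cleaned rows into the target store (a list stands in for a warehouse table)."""
    target.extend(rows)
    return len(rows)

raw = "account,amount\n ab-101 ,12.50\nCD-202,3.75\n"
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
# warehouse[0] == {"account": "AB-101", "amount_cents": 1250}
```

Production ETL adds scheduling, error handling and restart/rollback logic around the same three stages, which is what the posting's batch-cycle and recovery requirements refer to.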

          

Public Sector Hardware Sales Representative    

Cache   

This position is responsible for new account development and/or expanding existing accounts within an established geographic territory. Works as part of an account team to identify, qualify and deliver hardware products and solutions. Responsible for the account plan to drive goal attainment in the assigned territory. Coordinates with the other members of the sales team (employees and partners) to support account sales and business development strategies. Helps identify and engage the appropriate partner to meet customer specifications. Becomes a trusted advisor to key customer influencers and decision makers. Drives the company's strategy into assigned accounts. Follows all of the company's methodologies and processes related to sales opportunity pursuit. Ensures that the company's sales programs are known and executed in the assigned territory, including personal follow-up and engagement in selected opportunities. Achieves or exceeds the quarterly and annual sales goals. May travel frequently. Duties and tasks are standard, with some variation. Completes own role largely independently within defined policies and procedures. 2 years of relevant experience and a BA/BS degree preferred. Oracle is an Affirmative Action-Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veterans status, age, or any other characteristic protected by law. Public Sector Hardware Sales Representative - North East USA - NY, NJ, CT, RI, MA, ME, VT, NH - Hardware Sales Rep II Oracle's mission-focused SLG Infrastructure Sales team seeks a professional with a techno-savvy spirit and an ability to bring ideas, concepts, and people together to deliver cost-effective and technically sound hardware solutions to Public Sector customers in the Pacific Northwest and Alaska. 
We are searching for a results-oriented hardware field sales representative with experience selling into State and Local Government agencies in the capital. Top candidates possess a background selling both on-premise and cloud-based public sector hardware systems. The product portfolio consists of Engineered Systems (Exadata, SuperCluster, & Big Data Appliance), enterprise-class private/public cloud solutions, x86/SPARC servers, and storage systems. Proficiency in the areas of enterprise architecture, OS, database, application, HPC, storage, virtualization, IaaS, and cloud is expected. The successful representative will sell effective solutions to the C-level executives within these agencies, generating net-new business units within these accounts. The representative must leverage inside sales teams, channel sales teams, business development teams, resellers, and marketing campaigns to generate demand and create pipeline within the territory. The representative will have the authority to create and execute a personal territory strategic plan and must manage the sales cycle from opportunity discovery to closure. Experience with the Value Selling methodology is an advantage. Candidates should be skilled in compliance-based issues, budgets, contracts, regulations, and procurement cycles. Experience in complex deal/capture management, finance, pricing, and contracting is a plus. State agency account sales experience is necessary, preferably in the North East region. Hardware solution sales experience is preferred. The successful candidate will have a history of meeting or exceeding quota. Preferred Geographic Location - North East USA - NY, NJ, CT, RI, MA, ME, VT, NH. Travel schedule at the representative's discretion. This is a Hunter and individual contributor role. 
Resume to ******************** Job: *Sales Organization: *Oracle Title: Public Sector Hardware Sales Representative - North East USA - NY, NJ, CT, RI, MA, ME, VT, NH - Hardware Sales Rep II Location: United States Requisition ID: 19001COT

          

Big Data Engineer - colossal systems - Richmond, QC   

Cache   

8+ years of IT experience, including strong 4+ years' experience in *Big Data*, 3+ years' experience on the *Spark* platform, and proficiency in coding in *Scala*.
From Indeed - Thu, 31 Oct 2019 15:19:07 GMT - View all Richmond, QC jobs

          

CFO: within two years, the financial world will be completely dependent on big data    

Cache   

none

          

Modern Data Engineer   

Cache   

About Us
Interested in working for a human-centered technology company that prides itself on using modern tools and technologies? Want to be surrounded by intensely curious and innovative thinkers?

Seeking to solve complex technical challenges by building products that work for people, meet and exceed the needs of businesses, and work elegantly and efficiently?

Modeling ourselves after the 1904 World's Fair, which brought innovation to the region, 1904labs is seeking top technical talent in St. Louis to bring innovation and creativity to our clients.

Our clients consist of Fortune 500 and Global 2000 companies headquartered here in St. Louis. We partner with them on complex projects that range from reimagining and refactoring their existing applications, to helping to envision and build new applications or data streams to operationalize their existing data. Working in a team-based labs model, using our own flavor of #HCDAgile, we strive to work at the cutting edge of technology's capabilities while solving problems for our clients and their users.

The Role
As a Modern Data Engineer you would be responsible for developing and deploying cutting-edge distributed data solutions. Our engineers have a passion for open source technologies, strive to build cloud-first applications, and are motivated by our desire to transform businesses into data-driven enterprises. This team will focus on working with platforms such as Hadoop, Spark, Hive, Kafka, Elasticsearch, SQL and NoSQL/Graph databases as well as cloud-based data services.

Our teams at 1904labs are Agile, and we work in a highly collaborative environment. You would be a productive member of a fast-paced group and have an opportunity to solve some very complex data problems.

Requirements
3+ years of progressive experience as a Data Engineer, BI Developer, Application Developer or related occupation.


  • Agile: Experience working in an agile, team-oriented environment
  • Attitude / Aptitude: A passion for everything data with a desire to be at the cutting edge of technology and consistently deliver working software while always keeping an eye on opportunities for innovation.
  • Technical Skills (you have experience with 2 or more of these bullet points):
      • Programming in Java (Or similar JVM language such as Scala, Groovy, etc) and/or Python
      • Architecting and integrating big data pipelines
      • Working with large data volumes; this includes processing, transforming and transporting large scale data using technologies such as: MR/TEZ, Hive SQL, Spark, etc.
      • Have a strong background in SQL / Data Warehousing (dimensional modeling)
      • Have a strong background working with and/or implementing architecture for RDBMS such as: Oracle, MySQL, Postgres and/or SQLServer.

      • Experience with traditional ETL tools such as SSIS, Informatica, Pentaho, Talend, etc.
      • Experience with NoSQL/Graph Data Modeling and are actively using Cassandra, HBase, DynamoDB, Neo4J, Titan, or DataStax Graph
      • Installing/configuring a distributed computing/storage platform, such as Apache Hadoop, Amazon EMR, Apache Spark, Apache Hive, and/or Presto
      • Working with one or more streaming platforms, such as Apache Kafka, Spark Streaming, Storm, or AWS Kinesis
      • Working knowledge of the Linux command line and shell scripting
        Desired Skills
        • Analytics: Have working knowledge of analytics/reporting tools such as Tableau, Spotfire, Qlikview, etc.
        • Open Source: Are working with open source tools now and have a background in contributing to open source projects.


          Perks
          • Standard Benefits Program (medical, dental, life insurance, 401(k), professional development and education assistance, PTO).
          • Innovation Hours - Ten percent (10%) of our work week is set aside to work on our own product ideas in a highly collaborative and supportive environment. The best part: The IP remains your own. We are a high-growth culture and we know that when we help people focus on personal and professional growth, collectively, we can achieve great things.
          • Dress Code - we don't have one


            This job is located in St. Louis, MO. While we would prefer local candidates, your current location is not the most important factor; if you would be relocating, please help us understand why you would like to call St. Louis home.

          

ETL Developer with Security Clearance   

Cache   

Senior ETL Developer
Location: Chantilly
Clearance Level: TS/SCI with FSP

Summary
The successful candidate will leverage their development skills and experience, as part of our Sponsor's Data Layer Engineering Team, to support the successful ingestion, cleansing, transformation, loading, and display of significant amounts of data.

Duties, Tasks & Responsibilities
--- Designing and implementing large-scale ingest systems in a Big Data environment
--- Optimizing all stages of the data lifecycle, from initial planning, to ingest, through final display and beyond
--- Designing and implementing data extraction, cleansing, transformation, loading, and replication/distribution
--- Developing custom solutions/code to ingest and exploit new and existing data sources
--- Working with Sponsor development teams to improve application performance
--- Organizing and maintaining Data Layer documentation, so others are able to understand and use it
--- Collaborating with teammates, other service providers, vendors, and users to develop new and more efficient methods

Required Experience, Skills & Technologies
--- Bachelor's Degree in Computer Science, Information Systems, Engineering, or other related discipline, OR 10 years of related software development experience may be substituted for education
--- At least 5 years of data analysis and parsing experience
--- At least 5 years of Java development experience coupled with significant SQL/database experience
--- Strong experience with the full data lifecycle, from ingest through display, in a Big Data environment
--- Strong experience with Java-related technologies, such as JDK, J2EE, EJB, JDBC, and/or Spring, and experience with RESTful APIs
--- Experience with Hadoop, HBase, MapReduce
--- Experience with Kafka and ZooKeeper
--- Experience developing and performing ETL tasks in a Linux environment

Desired Experience, Skills & Technologies
--- Experience with Elasticsearch
--- Experience with Gradle

          

Software Engineering Manager   

Cache   

THE CHALLENGE
Eventbrite's business continues to grow and scale rapidly, powering millions of events. Event creators and event goers need new tools and technologies that empower them to create and enjoy some of life's most memorable moments: live experiences. One of the most important elements in achieving our company goals is our people. As an engineering manager you're responsible for the careers, productivity, and quality (among other things) of Eventbrite's builders.

THE TEAM
We're a people-focused Engineering organization: the women and men on our team value working together in small teams to solve big problems, supporting an active culture of mentorship and inclusion, and pushing themselves to learn new things daily. Pair programming, weekly demos, tech talks, and quarterly hackathons are at the core of how we've built our team and product. We believe in engaging with the community, regularly hosting free events with some of the top technical speakers, and actively contributing to open source software (check out Britecharts as an example!). Our technology spans across web, mobile, API, big data, machine learning, search, physical point of sale, and scanning systems. This role is based in Eventbrite's Nashville office. We're one of 5 Eventbrite engineering offices around the world. For a little taste of what the team and Eventbrite's Nashville office are like, see http://bit.ly/NashEng

THE ROLE
We're looking for a people-focused manager to help support the career growth of our engineers and collaborate on improvement within our organization.

THE SKILL SET
    • Demonstrated experience in recruiting a well-rounded, diverse technical team
    • You have a strong technical background and can contribute to design and architectural discussions - coach first, player second.
    • You support your team in providing context and connecting it with how the team impacts the organization
    • Experience working with a highly collaborative environment, coaching a team who ships code to production often
    • With the help of other engineering managers, you develop a sustainable, healthy work environment which is both encouraging and challenging
    • In a leadership/management position for 2-5 years with demonstrated growth of high-functioning engineering teams



      ABOUT EVENTBRITE
      Eventbrite is a global ticketing and event technology platform, powering millions of live experiences each year. We empower creators of events of all shapes and sizes - from music festivals, experiential yoga, and political rallies to gaming competitions - by providing them the tools and resources they need to seamlessly plan, promote, and produce live experiences around the world. Last year, the team served 795,000 creators hosting nearly 4 million experiences across 170 countries. Meet some of the Britelings that make it happen.

      IS THIS ROLE NOT AN EXACT FIT?
      Sign up to keep in touch and we'll let you know when we have new positions on our team.

      Eventbrite is a proud equal opportunity/affirmative action employer supporting workforce diversity. We do not discriminate based upon race, ethnicity, ancestry, citizenship status, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), marital status, registered domestic partner status, caregiver status, sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, genetic information, military or veteran status, mental or physical disability, political affiliation, status as a victim of domestic violence, assault or stalking, or other applicable legally protected characteristics. Applicant Privacy Notice

          

Asacol | Price Oregon - by: panacheindonesian   

Cache   

Looking for asacol? Not a problem!

Guaranteed Worldwide Shipping

Discreet Package

Low Prices

24/7/365 Customer Support

100% Satisfaction Guaranteed.

Visit This Website...

Tags:
asacol skutki uboczne order asacol
asacol amex purchase paypal
cost asacol lng iud
cheap asacol ach tabs
pharmacy asacol drug find delivery
order asacol norfolk
asacol delivery no script tabs
buy brand asacol online greece
purchase asacol idaho
order asacol online review
discount asacol ach delivery
cheapest asacol discount mastercard legally
asacol buy online usa
canada buy asacol
discounted asacol pills fast delivery
asacol ups cod purchase
buy asacol 400mg wire transfer
asacol overnight delivery saturday delivery
need rebate asacol money order
asacol buy without a prescription
cheapest asacol discounts online
asacol overnight no script prior
buy asacol no prescription purchase
asacol ulcerative colitis cost fast
purchase asacol cheap find
price asacol pill otc
cost asacol discount fedex
asacol 30 mg no prescription
price asacol oregon
overnight asacol cheap saturday
asacol no script internet
canadian asacol price
discount asacol cheap no prescription
ipocal asacol cod accepted
order asacol annapolis
buy asacol senza
cheap asacol in uk online
purchase asacol with visa e5gzp
asacol canadian pharmacy online visa
buy asacol continus tabs
order asacol online in
asacol without script cost pills
no script salofalk asacol
discount otc asacol in belfast
i want asacol cheapest discount
asacol overnight cod no prescriptionu
cheap asacol side
to buy asacol mesalamine pharmaceutical
purchase asacol maryland
cheapest asacol internet western union
buy asacol shipped ups x787o
cheap asacol illinois
buy cod asacol legally mastercard
generic asacol discount free shipping
asacol cough buy
cheap pharmacy asacol
pharmacy asacol legally saturday delivery
order asacol baton rouge
For starters, this system will likely include courses on community engagement and active citizenship. To pursue certification as a pharmacy technician and pass your exams, you can address your inquiry to the Pharmacy Technician Certification Board (PTCB) or the Institute of Certification of Pharmacy Technicians (ICPT). The big question databases ensure that you get different questions almost every time you take the sample pharmacy technician exam.

Have you checked to see if you are eligible for public health insurance programs? Online transactions for medicine have therefore become the order of the day, which is why pharmacies feel an immediate need for a pharmacy merchant account. Pharmacy technicians and pharmacists, primarily in large retail or hospital pharmacies, do not have control over the copay.

Ram Eesh Institute of Vocational and Technical Education. There are quite a number of good pharmacy tech schools, both on the internet and at campuses, that will give you the instruction you need in a short amount of time, typically from four months to two years. This includes learning various medical names, actions of medications, uses, and doses.

The pharmaceutical industry especially cannot afford to allow its practitioners to delay or refuse treatment of clients due to the perceived usage of any medical device, be it a pill or a syringe. To find degree programs that train you for a career as a pharmacy technician, you can check out the Directory of Schools website at directoryofschools. However, by exercising with adequate practice pharmacy technician test questions, we have better odds of being successful.

There are various levels of online degrees available, including associate's, bachelor's and doctorate programs. I feel this aspect is just about the toughest part of the residency curriculum. Several brands of natural cleaners can be found at most large stores, including Bio-Kleen and Seventh Generation.

          

Software Engineering - Mid Career   

Cache   

Description:
Do you have a commitment to success, proven interpersonal skills, and a positive outlook? Are you a creative thinker and problem solver? Are you looking for a meaningful career?

If so, then we would love to speak with you! The Engineering & Technology organization is a highly dynamic and growing organization that plays a vital role in providing mission-critical intelligence solutions by optimizing satellite constellations, processing data through its entire life cycle and developing reliable cutting-edge infrastructure for information dissemination across a global network and to a variety of programs.

You'll have an opportunity to work on world-class programs such as Orion, SBIRS, GPS-III, Commercial Vehicles, Deep Space Exploration, and Classified Special Programs.

Within these Programs, you will participate in the design, development, verification and deployment of complex new software. You will utilize your previous software experience, and/or skills, to develop and maintain software using the following languages and technologies: C++, Java, Python, Big Data, Data Science, Cyber Security, and Embedded systems.

This position is in support of the Air Force's Next Generation (Next Gen) Overhead Persistent Infrared (OPIR) Program. The qualified candidate will be a member of the SEIT Database Software Tools team. Responsibilities include performing software tools design, development, unit test, software integration, test data generation, test plan documentation and formal qualification testing of software tools in support of the data transformation and validation of database inputs. Responsibilities also include database tools and product development, integration, test, validation, and database deployment into the operational system in addition to supporting Program data requirements and products for System Test. We are seeking someone who can communicate effectively with the team to work through roadblocks and resolve issues quickly. You will be responsible for the production of Command and Telemetry Databases and multiple supporting products required for database deployment into the operational system.

Applicants selected will be subject to a government security investigation and must meet eligibility requirements for access to classified information.

Typically has 5 - 10 years of professional experience.

Basic Qualifications:

--- Bachelors degree from an accredited college in a related discipline, or equivalent experience/combined education, with 5 years of professional experience; or 3 years of experience required with a related Masters degree.

--- Strong programming skills to develop maintainable code, test data, perform thorough unit, integration & regression testing

--- Experience with one or more of the following programming languages: Java, C, C++, C#, SQL, Perl, Node JS or Python

--- Experience with one or more of the following database products - Microsoft SQL Server, SAP ASE, MySQL, PostgreSQL, DB2, or Oracle

--- Ability to obtain DoD Secret clearance.

Desired Skills:

--- Bachelors degree with 5 years of experience working in Flight Software and/or Database Development for the SBIRS and/or AEHF programs

--- Experience in Software Engineering, Systems Engineering, Technical and Management Process, Technical Writing and Software Testing

--- Experience in Agile software development

--- Experience developing data translation software

--- Software Qualification Testing experience with developing unit and functional tests preferred. SBIRS Database tools software experience would be a plus.

--- Creating processes, requirements and support documentation

--- Familiarity with spacecraft subsystems, hardware, telemetry, command & control.

--- Experience with following SW tools: Transformation Tools Implementation, STSS, ClearCase & ClearQuest, Git or SVN

--- Advanced skills in any of the following programming languages: JAVA, C, C++, C#, SQL, Perl, or Python, Node.js, .NET, ClearCase, Git, SVN or equivalent SW CM Tool

--- Advanced skills with one or more of the following database products - Microsoft SQL Server, SAP ASE, MySQL, PostgreSQL, DB2, or Oracle

--- Strong communication and interpersonal skills.

--- Familiarity with SBIRS STSS Test Systems, JTRANS DB Development and SBIRS Common Database Schema

--- Current DoD Secret clearance. TS/SCI preferred

To promote the sharing of ideas, Lockheed Martin fosters an inclusive work environment that encourages differences and big-picture thinking. Our employees play an active role in strengthening the quality of life where we live and work by volunteering more than 850,000 hours annually. Here are some of the benefits you can enjoy:

--- Medical

--- Dental

--- 401k

--- Paid time off

--- Work/life balance

--- Career development

--- Mentorship opportunities

--- Rewards & recognition

Learn more about Lockheed Martin's competitive and comprehensive benefits package.

Lockheed Martin is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

Join us at Lockheed Martin, where your mission is ours. Our customers tackle the hardest missions. Those that demand extraordinary amounts of courage, resilience and precision. They're dangerous. Critical. Sometimes they even provide an opportunity to change the world and save lives. Those are the missions we care about.

As a leading technology innovation company, Lockheed Martin's vast team works with partners around the world to bring proven performance to our customers' toughest challenges. Lockheed Martin has employees based in many states throughout the U.S. and internationally, with business locations in many nations and territories.

EXPERIENCE LEVEL:

Experienced Professional

          

Software Engineer (Mid-Level) with Security Clearance   

Cache   

Software Engineer (Mid-Level)
Chantilly, VA 20151
Security Clearance: TS/ISSA
Aperio Global is hiring a mid-level Software Engineer to provide support to a federal government program providing full life cycle development for data development, database operations, and data analytics. Responsibilities include: --- Work with a talented team of developers and data scientists exposing non-standard data through APIs and web applications, particularly REST APIs using AJAX.
--- Support Natural Language Processing (NLP) including OCR, information extraction, and indexing.
--- Stretch your technical capabilities to work across as much of the full stack as you are able for cloud-based operations, ETL, database operations, analytics, front end work, and technology evaluation. Requirements include: --- U.S. citizenship --- Current TS clearance and poly (TS/ISSA)
--- Bachelor's degree in Computer Science, Engineering, Information Security, Data Science, or related field. Additional years of experience in lieu of a degree will be considered.
--- Mid-level experience (5+ years) providing development and data services in a government environment.
--- Experience supporting Agile development is required. Experience in a SecDevOps environment is preferred.
--- Full stack development experience or the ability to expand your IT acumen.
--- Experience with ETL, database operations, and data analytics.
--- Experience with SQL database management.
--- Strong Java development experience using v8 or later.
--- Front-end development experience is a plus.
--- Experience developing in an Amazon Web Services (AWS) environment.
--- Hands-on development supporting data through APIs and web application.
--- Prior experience with common services, drop-in UI components, and deep linking is desired.
--- Experience with information extraction such as regex, entities, sentiment, geotags, topic, events, etc.
--- Experience supporting interfaces and working with tools for big data such as R, Python, Hive, or Pig, particularly for Natural Language Processing (NLP) or Machine Learning (ML).
--- Programming experience with web applications in HTML, CSS and JavaScript using Node, React, or Angular (v2 or later).
--- Experience establishing NiFi data flows is highly desired.
--- Prior work with Systems Administration in an AWS environment is desired.
--- Experience with Elasticsearch, Logstash, and Kibana (ELK) or Solr. Aperio Global delivers professional, innovative strategies and technology to integrate information security and artificial intelligence into the Department of Defense, federal and local government agencies, and commercial sector operations. We bring world-class resources and experience to help clients successfully navigate the complex and ever-changing issues in implementing next-generation concepts while effectively discovering and sustaining critical technology. Visit us at www.aperioglobal.com.

          

InterSystems and Inspur Enter Agreement to Innovate Healthcare Big Data Platforms in China   

Cache   

News & Events > News > InterSystems and Inspur Enter Agreement to Innovate Healthcare Big Data Platforms in China BEIJING, China, November 4, 2019 – InterSystems, a global leader in information technology platforms for health, business, and government applications, today...

          

Case Study Analytics and Big Data: Maxcom Telecomunicaciones   

Cache   

Vertica delivers drastically improved performance for regulatory compliance and enables an eighty-five percent reduction in fraud-related costs vs. last year. Maxcom Telecomunicaciones is a Mexican integrated telecommunications operator providing managed voice and data...

          

MTS Intern - PhD   

Cache   

Date Posted October 29, 2019 Category Science-Computer Sciences Employment Type Full-time Application Deadline Open until filled Who are our employees? We're an eclectic group of 4,000+ dreamers, believers and builders, operating in over 40 countries. We're Hungry. Humble. Honest. With Heart. The 4H's: these are our core values and the DNA of our company. They help drive our employees to succeed, to strive to be better, to learn from every experience. Our employees are encouraged to have spirited debates and conversations and to think with a founder's mindset. This means we're all CEO's of the company and, as such, make the best decision every day that aligns with our company goals. It's through our values, our conversations and mindsets that we can continue to disrupt the industry and drive innovation in the market. Who are we in the market? Nutanix is a global leader in cloud software and hyperconverged infrastructure solutions, making infrastructure invisible so that IT can focus on the applications and services that power their business. Companies around the world use Nutanix Enterprise Cloud OS software to bring one-click application management and mobility across public, private and distributed edge clouds so they can run any application at any scale with a dramatically lower total cost of ownership. The result is organizations that can rapidly deliver a high-performance IT environment on demand, giving application owners a true cloud-like experience. Learn more about our products at *************** or follow us on Twitter @Nutanix. Nutanix engineers are crafting a groundbreaking technology, building the Nutanix Enterprise Cloud OS. We're using our love of programming and diverse backgrounds to deliver the simplicity and agility of popular public cloud services, but with the security and control that you need in a private cloud. At Nutanix, you'll find no shortage of challenging problems to work on. 
We work closely with our product teams in a collegiate, collaborative environment that encourages the open exploration of ideas. The Role: MTS Intern The Engineering Summer Internship is an opportunity to gain exposure to one or more Nutanix engineering roles according to your skillset and interests. Some potential roles include (but are not limited to) working on the core data path, storage and filesystems development, distributed systems, infrastructure and platform/hardware deployment, data protection and replication, tools and automation, development of a big data processing platform, development of the API and analytics platform, and Web and front-end UI/UX development. Each intern is paired with a Member of Technical Staff who serves as a guide through our engineering culture, toolsets, and development methodology. Our internship program also includes a series of lunch and learns, training events, and social outings to expose you to other aspects of a rapidly growing Silicon Valley technology company. Responsibilities: - Architect, design, and develop software for the Nutanix Enterprise Cloud Platform - Develop a deep understanding of complex distributed systems and design innovative solutions for customer requirements - Work alongside development, test, documentation, and product teams to deliver high-quality products in a fast-paced environment - Deliver on an internship project over the course of the program. Present the final product to engineering leadership. 
Requirements: - Love of programming and skilled in one of the following languages: C++, Python, Golang, or HTML/CSS/Javascript - Extensive knowledge or experience with Linux or Windows - Have taken courses or completed research in the areas of operating systems, file systems, big data, machine learning, compilers, algorithms and data structures, or cloud computing - Knowledge of or experience with Hadoop, MapReduce, Cassandra, Zookeeper, or other large scale distributed systems preferred - Interest or experience working with virtualization technologies from VMware, Microsoft (Hyper-V), or Redhat (KVM) preferred - Detail-oriented with a strong focus on code and product quality - The passion & ability to learn new things, while never being satisfied with the status quo Qualifications and Experience: - Pursuing a PhD degree in Computer Science or a related engineering field required. - Available to work up to 40 hours per week for 12 weeks over the summer months Nutanix is an equal opportunity employer. The Equal Employment Opportunity Policy is to provide fair and equal employment opportunity for all associates and job applicants regardless of race, color, religion, national origin, gender, sexual orientation, age, marital status, or disability. Nutanix hires and promotes individuals solely on the basis of their qualifications for the job to be filled. Nutanix believes that associates should be provided with a working environment that enables each associate to be productive and to work to the best of his or her ability. We do not condone or tolerate an atmosphere of intimidation or harassment based on race, color, religion, national origin, gender, sexual orientation, age, marital status or disability. We expect and require the cooperation of all associates in maintaining a discrimination and harassment-free atmosphere. Apply *Please mention PhdJobs to employers when

          

Assistant/Associate Professors--Physical Science and Data Science   

Cache   

Job Summary
The College of Science at Purdue University invites applications for multiple positions in "Physical Science and Data Science" at the Assistant or Associate Professor level beginning August 17, 2020. Assistant Professor candidates with exceptional qualifications may be considered for an early career endowed professorship. This opportunity is coordinated with concurrent searches in "Computer Science, Mathematics, and Statistics focused on Data Science" and "Data Science in the Life Sciences."

Qualifications
These positions come at a time of new leadership and with multiple commitments of significant investment for the College of Science. We particularly encourage candidates who demonstrate the potential for collaboration across multiple disciplines. We expect that most faculty hired through this search will have interdepartmental joint appointments. College of Science Departments hosting research related to Physical Science include: Chemistry; Earth, Atmospheric, and Planetary Sciences; and Physics and Astronomy, as well as Computer Science, Mathematics, and Statistics. Candidates must have a Ph.D. (or its equivalent) in a closely related field. Successful candidates are expected to develop a vigorous, externally funded, internationally recognized theoretical, computational, experimental, and/or observational research program that addresses research questions of fundamental importance. They are also expected to teach undergraduate and/or graduate courses to a diverse student body and supervise graduate students. Successful candidates will combine an outstanding record of research excellence with a commitment to effective and engaged teaching in both physical science and data science. Candidates should have a broad understanding of the numerical and analytic methods in data science, including machine learning, for physical science subject matters, along with the software systems that implement them. 
The candidate's program is expected to complement existing research within the home department and teaching needs at the undergraduate and graduate levels. The potential to develop one or more of the following areas is desirable:

- Development and application of data science and machine learning methods to all areas of chemistry, including computational chemistry, measurement science, analytical chemistry, organic chemistry, physical chemistry, and biological chemistry; or
- Development and application of data-intensive computations in the fields of numerical astrophysics and cosmology; or
- Development of techniques in big data/astrostatistics in a variety of astronomical sub-fields with increasingly large data sets; or
- Development and application of advanced data science methods to areas of atmospheric sciences, including but not limited to computational geofluid dynamics, clouds and convection, climate systems, severe weather, subseasonal-to-seasonal prediction, atmospheric chemistry, and remote sensing of Earth or other planetary atmospheres; or
- Development and application of data science methods to large-scale problems in solid-earth geosciences, including but not limited to those of theoretical and applied geophysics, seismology, geodynamics, tectonophysics, geochemistry, and energy science.

The University, College and Departments: Purdue University is a public land-grant university in West Lafayette, Indiana. Purdue Discovery Park provides open, collaborative research environments with over 25 interdisciplinary centers, institutes, and affiliated project centers, most notably the Integrative Data Science Initiative. The Rosen Center for Advanced Computing offers advanced computational resources and services with local HPC clusters, research data storage, and data networks. It is the campus liaison to NSF XSEDE and the Open Science Grid. As part of the Physics and Astronomy department, the Astrophysics group has a strong funding record with the major agencies.
NSF is strongly invested in LSST, advanced LIGO, and IceCube, all areas of research focus in the group. Inter-departmental efforts to connect with faculty in Computer Science and Statistics in the broad scope of Data Science are underway to develop a state-of-the-art classification and strategy engine for LSST. The group has leadership in theoretical and data-intensive numerical modeling of astrophysical sources, making extensive use of the Purdue as well as NASA and NSF clusters. The Department of Earth, Atmospheric, and Planetary Sciences has a Geodata Science Initiative that merges geosciences and data science strategically in research and education. Select participants conduct transdisciplinary collaborative research at the nexus of weather, climate, environment, resources, energy, and society, supported by HPC clusters with GPU, Hadoop, or Spark systems. The Geodata Science for Professionals MS program is an agent for industrial partnerships.

Application Procedure: Applicants should submit a cover letter, a curriculum vitae, a teaching statement, and a description of proposed research electronically at https://God.blue/splash.php?url=C69RLwGIAp8Fx7oPPqgZZKQiPkesxI_PLUS_5gDuZMJf7_SLASH_TEKtKo6dHY_PLUS_N3Pn9l9V80Yn7QXR7SAtLsqrgxYxTfrlsrCGvWg3DC7gNokjjlCJmhl4or3JcYqd1ZQny_PLUS_kjA1ftej4G3XXD03nriPWLrgwVbnDNzPZYqjdUKumK113MdA_PLUS_LkUwpOU4Smf0MLqwpXwHhNQuF5oyZYWMAE6SCS88LPA_EQUALS__EQUALS_. Additionally, applicants should arrange for three letters of reference to be e-mailed to the search committee at physdatasci@purdue.edu, specifically indicating the position for which the applicant is applying. Applications will be held in strict confidence and will be reviewed beginning December 1, 2019. Applications will remain in consideration until positions are filled. Inquiries can be sent to physdatasci@purdue.edu. Purdue University's College of Science is committed to advancing diversity in all areas of faculty effort, including scholarship, instruction, and engagement.
Candidates should address at least one of these areas in the cover letter, indicating past experiences, current interests or activities, and/or future goals to promote a climate that values diversity and inclusion. Salary and benefits are competitive, and Purdue is a dual-career friendly employer. Purdue University is an EOE/AA employer. All individuals, including minorities, women, individuals with disabilities, and veterans, are encouraged to apply.

          

Universities Increasing Programs for Data Scientists   


Investment in big data is increasing, but it means squat if there's no talent to program the tools, analyze the results, and create business value. Universities are responding by creating programs to train a generation of data scientists.

          

THEMATIC RESEARCH REPORTS: Retail Technology   


The Global Retail Industry is preparing to ride the digital wave – contextualizing and implementing new technologies such as cloud, mobility, big data and analytics, and the Internet of Things (IoT) – to best align with future business objectives. Reasons to Buy: Corporations: Helps CEOs in all industries understand the disruptive threats to their competitive …

          

THEMATIC RESEARCH REPORTS: Oil & Gas Technology   


The Global Oil & Gas Industry is preparing to ride the digital wave – contextualizing and implementing new technologies such as cloud, mobility, big data, and analytics, and the Internet of Things (IoT) – to best align with future business objectives. List of reports in the series below: Reasons to Buy: Corporations: Helps CEOs in …

          

THEMATIC RESEARCH REPORTS: Utilities Technology   


The Global Utilities Industry is preparing to ride the digital wave – contextualizing and implementing new technologies such as cloud, mobility, big data, and analytics, and the Internet of Things (IoT) – to best align with future business objectives. List of reports in the series below: Reasons to Buy: Corporations: Helps CEOs in all industries …

          

THEMATIC RESEARCH REPORTS: Tourism Technology   


The Global Tourism Industry is preparing to ride the digital wave – contextualizing and implementing new technologies such as cloud, mobility, big data, and analytics, and the Internet of Things (IoT) – to best align with future business objectives. List of reports in the series below: Reasons to Buy: Corporations: Helps CEOs in all industries understand …

          

THEMATIC RESEARCH REPORTS: Healthcare Technology   


The Global Healthcare Industry is preparing to ride the digital wave – contextualizing and implementing new technologies such as cloud, mobility, big data and analytics, and the Internet of Things (IoT) – to best align with care models. List of Healthcare Thematic Research reports below: Reasons to Buy: Corporations: Helps CEOs in all industries understand the disruptive threats …

          

“Big data” for life sciences – A human protein co-regulation map reveals new insights into protein functions   


Proteins are key molecules in living cells. They are responsible for nearly every task of cellular life and are essential for the maintenance of the structure, function, and regulation of tissues and organs in the human body.


          

Microsoft Highlights Azure Management, Office Improvements, and More at Ignite



These include Azure Arc, a multi-cloud management solution; Azure Synapse, a much faster data warehouse and big data platform; Power Automate with robotic process automation capabilities; changes to ...

          

Focusing on 5G integration, Hengtong appears at PT Expo China 2019


mherman, Thu, 11/07/2019 - 08:45

SUZHOU, China, 6 November 2019 /PRNewswire/ -- PT Expo China 2019 was recently held in Beijing. Hengtong took part in the expo, presenting intelligent 5G applications, Big Data and security, the energy internet, the industrial internet, 5G carrier networks, 5G high-speed transmission, 5G IoT and security, and new 5G technologies and products.

During the opening ceremony, the official launch of 5G commercialization took place, marking the beginning of 5G commercialization in China. Hengtong presented its solutions for intelligent applications and the Internet of Things in all respects.

At the same time, the China Optical Communication Summit, the 5G Intelligent Application Summit Forum and the Hengtong New Products Conference were also held. Dr. Shi Huiping, chief engineer of Hengtong's optics and electronics division, was invited to give a speech at the China Optical Communication Summit on digital solutions enabling industrial transformation and development. Dr. Shi Huiping stated that "the industrial internet is an important engine of 5G and of the integration of OT, CT and IT". He added that "Hengtong's intelligent factory construction system streamlines the R&D cycle, raises production efficiency and ensures high-quality management", and that "the arrival of 5G will accelerate industrial digital transformation and development".

At the New Products Conference, Hengtong published the "White paper of Hengtong 5G Intelligent application". The document presents a series of solutions for smart scenic spots, smart communities, smart industry and other application scenarios, as well as new products and technologies for 5G intelligent applications and the 5G optical network. It also describes the network architecture of the 5G optical network, a new secure quantum communication product, an intelligent energy solution, an IoT cloud security solution, an industrial internet security solution, a three-dimensional wide-area ocean observation system, and China's first tourism solution addressing growing demand, providing comprehensive and in-depth services for scenic spots and driving the local smart-scenic-spot economy.

Notably, Hengtong received a number of awards at the expo, including "2018-2019 China Communications Industry Social Responsibility Enterprise", "2018-2019 China Communications Industry 5G Optical Communication Leading Enterprise" and the "New Solution Award" granted by the China Communications Association. The company also won the "Best Industry Innovation Application Award" for its high-speed, high-capacity 5G wireless communication system, and the Hengtong booth won the popularity award from the PT Expo China organizing committee.

https://God.blue/splash.php?url=8JRRD7lUkrljW40G9FhacTevlRkSW_SLASH_lOo3GK9n_SLASH_3UTVbOU_PLUS_z5JGti21eRIvvrKnGEXiswvy2IEyQ47xv3ZO1YL0plKdBBBxlsshIt3x_SLASH_Ir26U1IyGZUSy1dYVbuDrv3L

Source: Hengtong Group

Source of information: PR Newswire

 


          

IT / Software / Systems: Senior Data Engineer - SQL / Redshift / AWS - Premier Ecommerce Publishing Brand - Los Angeles, California   


Are you a Senior Data Engineer with a strong SQL, ETL, Redshift and AWS background seeking an opportunity to work with massive amounts of data in a very hip marketplace? Are you a Senior Data Engineer interested in unifying data across various consumer outlets for a very well-funded lifestyle brand in the heart of Santa Monica? Are you an accomplished Senior Data Engineer looking for an opportunity to work in a cutting-edge tech environment consisting of SQL, Redshift, Hadoop, Spark, Kafka and AWS? If yes, please continue reading. Based in Santa Monica, this thriving lifestyle brand has doubled in size in the last year and keeps on growing. With over $75 million in funding, they work hard to provide their extensive audience with advice and recommendations in all things lifestyle: where to shop, eat, travel, etc. Branching into a number of different services and products over the next 12 months, they are building out their Engineering team. They are looking for a Senior Data Engineer to unify and bring to life mass amounts of data from all areas of the business: ecommerce, retail, content, web, mobile, advertising, marketing, experiential and more.
WHAT YOU WILL BE DOING:
- Architect new and innovative data systems that will allow individuals to use data in impactful and exciting ways
- Design, implement, and optimize Data Lakes and Data Warehouses to handle the needs of a growing business
- Build solutions that will leverage real-time data and machine learning models
- Build and maintain ETLs from 3rd-party sources and ensure data quality
- Create data models at all levels, including conceptual, logical, and physical, for both relational and dimensional solutions
- Work closely with teams to optimize data delivery and scalability
- Design and build complex solutions with an emphasis on performance, scalability, and high reliability
- Design and implement new product features and research the next wave of technology

WHAT YOU NEED:
- Extensive experience and knowledge of SQL, ETL and Redshift
- Experience wrangling large amounts of data
- Skilled in Python for scripting
- Experience with AWS
- Experience with Big Data tools (Hadoop, Spark, Kafka) is a nice plus
- Ability to enhance and maintain a data warehouse, including use of ETL tools
- Successful track record in building real-time ETL pipelines from scratch
- Previous ecommerce or startup experience is a plus
- Understanding of data science and machine learning technologies
- Strong problem-solving capabilities
- Strong collaborator and passionate advocate for data
- Bachelor's degree in Computer Science, Engineering, Math or similar

WHAT YOU GET:
- Join a team of humble, creative and open-minded Engineers shipping exceptional products consumers love to use
- Opportunity to work at an awesome lifestyle brand in growth mode
- Brand new office space; open and team-oriented environment
- Full Medical, Dental and Vision Benefits; 401k Plan
- Unlimited Vacation; summer vacations / time off; offices closed during winter holidays and New Year's
- Discounts on products, and other perks

So, if you are a Senior Data Engineer seeking an opportunity to grow with a global lifestyle brand at the cusp of something huge, apply now.

          

Big data, security are key investment areas for pharma industry    


The pharmaceutical industry is looking towards big data, blockchain, cloud computing and cybersecurity as its four major investment areas.

          

Digital Big Data Architect (M/F) - Thales - Paris


Job description: WHAT WE CAN ACCOMPLISH TOGETHER: As a Digital Technical Architect, you will play a key role in Thales's digital transformation and will be entrusted with the following missions: collaborating closely with the enterprise architects of the digital platform; collaborating with Thales's enterprise architecture, management, IT security, design and development teams; helping to identify the...

          

Other: Solution Architect - Reston, Virginia   


Summary / Description: We are seeking a motivated, career- and customer-oriented Solution Architect interested in joining our team in Reston, VA and exploring an exciting and challenging career with Unisys Federal Systems.

Duties:
- Participate in planning, definition, and high-level design of the solution and build in quality
- Actively participate in the development of the Continuous Delivery Pipeline, especially with enabler Epics
- Define architecture diagrams with interfaces
- Work with customers and stakeholders to help establish the solution intent information models and documentation requirements
- Collaborate with stakeholders to establish critical nonfunctional requirements at the solution level
- Work with senior leadership and technical leads to develop, analyze, split, and realize the implementation of enabler epics
- Participate in PI Planning and Pre- and Post-PI Planning, System and Solution Demos, and Inspect and Adapt events
- Define and develop value stream and program Enablers to evolve solution intent; work directly with Agile teams to implement
- Plan and develop the Architectural Runway in support of upcoming business Features and Capabilities
- Work with Management to determine capacity allocation for enablement work
- Design highly complex solutions with potentially multiple applications and high transaction volumes
- Analyze a problem from business and technical perspectives to develop a fit solution
- Document structure and behavior, and work to deliver a solution to a problem for stakeholders and developers
- Make recommendations about platform and technology adoption, including database servers, application servers, libraries, and frameworks
- Write proof-of-concept code (may also participate in writing production code)
- Keep skills up to date through ongoing self-directed training
- Advise senior management on how products and processes could be improved
- Help application developers adopt new platforms through documentation, training, and mentoring
- Create architecture documentation
- Deep understanding of industry patterns for application architecture and integration
- Good written and verbal communication skills with the ability to present technical details
- Ability to come up with a detailed architecture that includes infrastructure, security, and disaster recovery/BCP plans

Requirements:
- BA or BS plus 10 years of experience
- 10+ years of experience in IT solution application design, development and delivery with a focus on application architecture
- 5+ years of experience in building multi-tier applications using an applicable technology skill set such as Java
- Experience working with complex data environments
- Experience using modern software development practices and technologies, including Lean, Agile, DevOps, Cloud, containers, and microservices
- Development and unit-test tools such as Eclipse, git, JFrog Artifactory, Docker, JUnit, SonarQube, Contrast (or Fortify)
- Expert in using relevant tools to support development, testing, operations and deployment (e.g. Atlassian Jira, Chef (or Maven), Jenkins, Pivotal Cloud Foundry, New Relic, Atlassian HipChat, Selenium, Apache JMeter, BlazeMeter)
- Experience architecting or creating systems around open source, COTS and custom development
- Experience designing SSO solutions using SAML and XACML protocols
- Experience on multiple application development projects with similar responsibilities
- Demonstrated experience in utilizing frameworks like Struts, Spring, and Hibernate
- Formal training or certification in Agile software development methods
- AWS Certified Solution Architect
- Training/certification in Enterprise Architecture preferred

About Unisys: Do you have what it takes to be mission critical? Your skills and experience could be mission critical for our Unisys team supporting the Federal Government in their mission to protect and defend our nation, and transform the way government agencies manage information and improve responsiveness to their customers.
As a member of our diverse team, you'll gain valuable career-enhancing experience as we support the design, development, testing, implementation, training, and maintenance of our federal government's critical systems. Apply today to become mission critical and help our nation meet the growing need for IT security, improved infrastructure, big data, and advanced analytics. Unisys is a global information technology company that solves complex IT challenges at the intersection of modern and mission critical. We work with many of the world's largest companies and government organizations to secure and keep their mission-critical operations running at peak performance; streamline and transform their data centers; enhance support to their end users and constituents; and modernize their enterprise applications. We do this while protecting and building on their legacy IT investments. Our offerings include outsourcing and managed services, systems integration and consulting services, high-end server technology, cybersecurity and cloud management software, and maintenance and support services. Unisys has more than 23,000 employees serving clients around the world. Unisys offers a very competitive benefits package including health insurance coverage from first day of employment, a 401k with an immediately vested company match, vacation and educational benefits. To learn more about Unisys visit us at www.Unisys.com. Unisys is an Equal Opportunity Employer (EOE) - Minorities, Females, Disabled Persons, and Veterans.#FED# ()

          

Other: Cloud Application SME - Reston, Virginia   


Summary / Description: We are currently seeking a motivated, career- and customer-oriented Cloud Application SME with a background as a leader in using Agile development methods to join our team in Northern Virginia and begin an exciting and challenging career with Unisys Federal Systems. This individual shall be familiar with cloud-native services and be able to handle all the back-end and front-end technologies, including software development, databases, systems engineering, security and user experience, necessary to deliver mission capabilities to clients. The ideal candidate will have an entrepreneurial approach to addressing opportunities and a proven record of successfully leading teams in delivering capabilities. In this role, you must be able to translate business requirements into technical solutions. You will be leading teams as well as collaborating. You will apply your expertise on multiple complex work assignments, which are broad in nature, requiring originality and innovation in determining how to accomplish tasks. You will apply your comprehensive knowledge across key tasks and high-impact assignments, evaluate performance results and recommend changes as necessary to achieve project success. You will lead development and migration/modernization of systems related to a broad range of business areas. Your overall responsibilities will include designing, developing, enhancing, debugging, and implementing software solutions in the cloud to meet customer requirements and goals.
In this role you will:
- Create app architectures in the cloud
- Serve as the creative solution expert, applying your application design expertise and firm grasp of the latest application development technologies
- Utilize your extensive experience with, and continual growth in, all development technologies in use in the federal marketplace, and be seen as the application development expert among your peers
- Utilize hands-on experience with CSP-native application development tools to host secure, responsive and user-experience-driven applications
- Employ Agile/DevSecOps (Scrum, Kanban, etc.), supplying coaching, expertise and thought leadership
- Serve as project lead or lead technical staff in the course of application development projects, including leading Agile teams
- Build and manage delivery teams (including remote resources) in support of Unisys Federal Systems opportunities
- Lead and support hands-on full-stack engineering development on projects, coaching and helping delivery teams to adopt this as part of their development life cycle
- Understand the DevSecOps tooling landscape, with experience integrating various DevSecOps tools into toolchains to provide end-to-end application lifecycle management
- Support proposal development through the acquisition lifecycle, including creating responses, written materials, orals, plans, and artifacts

Requirements:
- Master's degree and 15 years of relevant experience, or equivalent
- Familiarity with Agile methodology and the ability to lead and work collaboratively in a team environment; excellent written and oral communication skills are essential, as well as strong customer focus and presence
- Hands-on experience at a mastery level in at least 4 current-generation languages (Java, .Net, any JavaScript variant, Go, etc.)
- Hands-on experience at a senior level in at least 3 scripting languages (Bash, Python, PowerShell, CloudFormation, Terraform, etc.)
- Mastery-level understanding of common development data formats, including XML, JSON, YAML and SQL, with the ability to rapidly parse this data for meaningful information
- A good problem solver who enjoys complex challenges and performs well under pressure
- A self-starter requiring minimal oversight in grasping requirements or potential tools and techniques to solve complex problems and tasks
- Demonstrable record of successfully leading teams
- Expertise with DevSecOps, testing automation, and Continuous Integration and Deployment (CI/CD) environments using tools such as Gradle and Jenkins
- Familiarity with microservices patterns and best practices
- Proven skills in cloud-native development: REST API creation; Angular; NoSQL databases such as Mongo or Dynamo; Java, including frameworks such as Spring and Spring Boot; Amazon Web Services and MS Azure; Docker and other operating-system-level virtualization (containerization) programs

Familiarity and experience with the following is desirable: Kafka distributed streaming platform; mobile application development; event-driven architecture; code samples or demo GitHub repos; OpenShift; graph databases; machine learning; advanced visualization tools; rules engines; UI/UX experience.

About Unisys: Do you have what it takes to be mission critical?
Your skills and experience could be mission critical for our Unisys team supporting the Federal Government in their mission to protect and defend our nation, and transform the way government agencies manage information and improve responsiveness to their customers. As a member of our diverse team, you'll gain valuable career-enhancing experience as we support the design, development, testing, implementation, training, and maintenance of our federal government's critical systems. Apply today to become mission critical and help our nation meet the growing need for IT security, improved infrastructure, big data, and advanced analytics. Unisys is a global information technology company that solves complex IT challenges at the intersection of modern and mission critical. We work with many of the world's largest companies and government organizations to secure and keep their mission-critical operations running at peak performance; streamline and transform their data centers; enhance support to their end users and constituents; and modernize their enterprise applications. We do this while protecting and building on their legacy IT investments. Our offerings include outsourcing and managed services, systems integration and consulting services, high-end server technology, cybersecurity and cloud management software, and maintenance and support services. Unisys has more than 23,000 employees serving clients around the world. Unisys offers a very competitive benefits package including health insurance coverage from first day of employment, a 401k with an immediately vested company match, vacation and educational benefits. To learn more about Unisys, visit us at www.Unisys.com. Unisys is an Equal Opportunity Employer (EOE) - Minorities, Females, Disabled Persons, and Veterans.

          


SVP/VP, Delivery Lead, Group Consumer Banking and Big Data Analytics Technology, Technology and Operations | DBS Bank Limited


Singapore, Singapore, Business Function Group Technology and Operations (T&O) enables and empowers the bank with an efficient, nimble and resilient infrastructure through a strategic focus on productivity, quality &

          

A duck. Taking a look at DuckDB since MonetDBLite was removed from CRAN


You may know that MonetDBLite was removed from CRAN.
DuckDB is coming up.



Breaking change

> install.packages('MonetDBLite')
Warning in install.packages :
  package 'MonetDBLite' is not available (for R version 3.6.1)

People who based their work on MonetDBLite may ask what happened and what to do, hoping not to play a risky game with database and tool choices for future work ("It's really fast, but we may waste some time if we have to replace it with another solution").

That's the game with open source. Remember the big changes in dplyr 0.7.
Sometimes we want better tools, and most of the time they do get better. That's really great.
And sometimes we don't have the time and energy to adapt our work to tools that improved in too iterative a way. Or in too subjective a way.
We want it to work, not break.
Keeping code as simple as possible (and avoiding nebulous dependencies, so, tidy?) is one key point.
Storing data in a database is another.

All we can say is that "we're walking on works in progress". Like walking on eggshells: more works in progress here probably means more breaking changes.

Works in progress for packages, and for (embedded) databases too!

From Monet to Duck

MonetDBLite's philosophy is to be like a "very very fast SQLite". But it's time for a change (or so it seems).
So we can thank the MonetDBLite developers, as it was a nice adventure to play and work with MonetDB's speed!
As a question: is there another person, some volunteers, or some possibility to maintain MonetDBLite (still a nice tool)?
There is not much information for the moment about what happened, which is why I am writing this post.

Here, I read that they are now working on a new solution, under MIT License, named DuckDB, see here for more details.

As I'm just an R user and haven't collaborated on the project, I'll keep it short: DuckDB takes good parts from SQLite and PostgreSQL (the parser), see here for the complete list; it looks promising. As in MonetDB, the philosophy is focused on columns and speed. And dates, for instance, are handled correctly, without having to convert them into "ISO-8601-like" character strings.

It can be called from C/C++, Python and R.

Here is a post about python binding.

I also put a link at the bottom of this page which gives some explanation about the name of this new tool and the DuckDB developers' point of view about data manipulation and storage[1].

Beginning with DuckDB in R

Create / connect to the db

# remotes::install_github("cwida/duckdb/tools/rpkg", build = FALSE)

library(duckdb)
library(dplyr)
library(DBI)

# Create or connect to the db
con_duck <- dbConnect(duckdb::duckdb(), "~/Documents/data/duckdb/my_first.duckdb")
#con <- dbConnect(duckdb::duckdb(), ":memory:")

con_duck
<duckdb_connection bae30 dbdir='/Users/guillaumepressiat/Documents/data/duckdb/my_first.duckdb' database_ref=04e40>

iris

dbWriteTable(con_duck, "iris", iris)
tbl(con_duck, 'iris')

Put some rows and columns in db

> dim(nycflights13::flights)
[1] 336776     19
> object.size(nycflights13::flights) %>% format(units = "Mb")
[1] "38.8 Mb"

Sampling it to get more rows, then duplicating the columns, twice.

# Sample to get bigger data.frame
df_test <- nycflights13::flights %>% 
  sample_n(2e6, replace = TRUE) %>% 
  bind_cols(., rename_all(., function(x){paste0(x, '_bind_cols')})) %>% 
  bind_cols(., rename_all(., function(x){paste0(x, '_bind_cols_bis')}))
> dim(df_test)
[1] 2000000      76
> object.size(df_test) %>% format(units = "Mb")
[1] "916.4 Mb"

Write in db

tictoc::tic()
dbWriteTable(con_duck, "df_test", df_test)
tictoc::toc()

It takes some time compared to MonetDBLite (no real benchmark here; I just ran this several times and the timings were consistent).

# DuckDB      : 23.251 sec elapsed
# SQLite      : 20.23 sec elapsed
# MonetDBLite : 8.4 sec elapsed

All three are pretty fast.
Most importantly, if queries are fast (and they are), then most of the time we're all right.

I want to stress that for now it's a work in progress; we have to wait for more communication from the DuckDB developers. I'm just writing this to share the news.

Glimpse

> tbl(con_duck, 'df_test') %>% glimpse()
Observations: ??
Variables: 76
Database: duckdb_connection
$ year                                   <int> 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013,
$ month                                  <int> 11, 10, 3, 5, 12, 9, 7, 3, 9, 4, 7, 6, 1, 1, 9, 10, 9, 8, 4, 1, 4, 9, 6
$ day                                    <int> 29, 7, 1, 2, 18, 18, 20, 7, 15, 25, 22, 1, 29, 18, 30, 27, 27, 22, 19, 
$ dep_time                               <int> 1608, 2218, 1920, NA, 1506, 1917, 1034, 655, 1039, 1752, 2018, 1732, 82
$ sched_dep_time                         <int> 1612, 2127, 1920, 2159, 1500, 1900, 1030, 700, 1045, 1720, 1629, 1728, 
$ dep_delay                              <dbl> -4, 51, 0, NA, 6, 17, 4, -5, -6, 32, 229, 4, -9, -3, -4, -3, 9, 38, 34,
$ arr_time                               <int> 1904, 2321, 2102, NA, 1806, 2142, 1337, 938, 1307, 2103, 2314, 1934, 11
$ sched_arr_time                         <int> 1920, 2237, 2116, 2326, 1806, 2131, 1345, 958, 1313, 2025, 1927, 2011, 
$ arr_delay                              <dbl> -16, 44, -14, NA, 0, 11, -8, -20, -6, 38, 227, -37, -16, -12, -10, -39,
$ carrier                                <chr> "UA", "EV", "9E", "UA", "DL", "DL", "VX", "UA", "UA", "AA", "B6", "UA",
$ flight                                 <int> 1242, 4372, 3525, 424, 2181, 2454, 187, 1627, 1409, 695, 1161, 457, 717
$ tailnum                                <chr> "N24211", "N13994", "N910XJ", NA, "N329NB", "N3749D", "N530VA", "N37281…
$ origin                                 <chr> "EWR", "EWR", "JFK", "EWR", "LGA", "JFK", "EWR", "EWR", "EWR", "JFK", "
$ dest                                   <chr> "FLL", "DCA", "ORD", "BOS", "MCO", "DEN", "SFO", "PBI", "LAS", "AUS", "…
$ air_time                               <dbl> 155, 42, 116, NA, 131, 217, 346, 134, 301, 230, 153, 276, 217, 83, 36, 
$ distance                               <dbl> 1065, 199, 740, 200, 950, 1626, 2565, 1023, 2227, 1521, 1035, 2133, 138
$ hour                                   <dbl> 16, 21, 19, 21, 15, 19, 10, 7, 10, 17, 16, 17, 8, 14, 8, 19, 15, 16, 20
$ minute                                 <dbl> 12, 27, 20, 59, 0, 0, 30, 0, 45, 20, 29, 28, 35, 50, 25, 0, 35, 55, 0, 
$ time_hour                              <dttm> 2013-11-29 21:00:00, 2013-10-08 01:00:00, 2013-03-02 00:00:00, 2013-05
..                                                                                                                     
..                                                                                                                     
..                                                                                                                     
$ minute_bind_cols                       <dbl> 12, 27, 20, 59, 0, 0, 30, 0, 45, 20, 29, 28, 35, 50, 25, 0, 35, 55, 0, 
$ time_hour_bind_cols                    <dttm> 2013-11-29 21:00:00, 2013-10-08 01:00:00, 2013-03-02 00:00:00, 2013-05
$ year_bind_cols_bis                     <int> 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013,
$ month_bind_cols_bis                    <int> 11, 10, 3, 5, 12, 9, 7, 3, 9, 4, 7, 6, 1, 1, 9, 10, 9, 8, 4, 1, 4, 9, 6
$ day_bind_cols_bis                      <int> 29, 7, 1, 2, 18, 18, 20, 7, 15, 25, 22, 1, 29, 18, 30, 27, 27, 22, 19, 
..                                                                                                                     
..                                                                                                                     
..                                                                                                                     
$ distance_bind_cols_bind_cols_bis       <dbl> 1065, 199, 740, 200, 950, 1626, 2565, 1023, 2227, 1521, 1035, 2133, 138
$ hour_bind_cols_bind_cols_bis           <dbl> 16, 21, 19, 21, 15, 19, 10, 7, 10, 17, 16, 17, 8, 14, 8, 19, 15, 16, 20
$ minute_bind_cols_bind_cols_bis         <dbl> 12, 27, 20, 59, 0, 0, 30, 0, 45, 20, 29, 28, 35, 50, 25, 0, 35, 55, 0, 
$ time_hour_bind_cols_bind_cols_bis      <dttm> 2013-11-29 21:00:00, 2013-10-08 01:00:00, 2013-03-02 00:00:00, 2013-05

Count

> tbl(con_duck, 'df_test') %>% count()
# Source:   lazy query [?? x 1]
# Database: duckdb_connection
        n
    <dbl>
1 2000000

Dates

Compared to SQLite, it handles dates/times correctly. No need to convert them to character.

tbl(con_duck, 'df_test') %>% select(time_hour)
# Source:   lazy query [?? x 1]
# Database: duckdb_connection
   time_hour                 
   <dttm>                    
 1 2013-11-29 21:00:00.000000
 2 2013-10-08 01:00:00.000000
 3 2013-03-02 00:00:00.000000
 4 2013-05-03 01:00:00.000000
 5 2013-12-18 20:00:00.000000
 6 2013-09-18 23:00:00.000000
 7 2013-07-20 14:00:00.000000
 8 2013-03-07 12:00:00.000000
 9 2013-09-15 14:00:00.000000
10 2013-04-25 21:00:00.000000
# … with more rows
tbl(con_sqlite, 'df_test') %>% select(time_hour)
# Source:   lazy query [?? x 1]
# Database: sqlite 3.22.0 [/Users/guillaumepressiat/Documents/data/sqlite.sqlite]
    time_hour
        <dbl>
 1 1385758800
 2 1381194000
 3 1362182400
 4 1367542800
 5 1387396800
 6 1379545200
 7 1374328800
 8 1362657600
 9 1379253600
10 1366923600
# … with more rows

Some querying

Running some queries

dplyr

It already works nicely with dplyr.

> tbl(con_duck, 'iris') %>% 
+   group_by(Species) %>% 
+   summarise(min(Sepal.Width)) %>% 
+   collect()
# A tibble: 3 x 2
  Species    `min(Sepal.Width)`
  <chr>                   <dbl>
1 virginica                 2.2
2 setosa                    2.3
3 versicolor                2  
> tbl(con_duck, 'iris') %>% 
+     group_by(Species) %>% 
+     summarise(min(Sepal.Width)) %>% show_query()
<SQL>
SELECT "Species", MIN("Sepal.Width") AS "min(Sepal.Width)"
FROM "iris"
GROUP BY "Species"

sql

Run query as a string

dbGetQuery(con_duck, 'SELECT "Species", MIN("Sepal.Width") FROM iris GROUP BY "Species"')
     Species min(Sepal.Width)
1  virginica              2.2
2     setosa              2.3
3 versicolor              2.0

Like for all data sources with DBI, if the query is more complex, we can write it comfortably in an external file and launch it like this for example:

dbGetQuery(con_duck, readr::read_file('~/Documents/scripts/script.sql'))

“Little” benchmarks

Collecting this big data frame

This doesn't mean much on its own, but it gives some idea of read speed. We collect df_test into memory, from DuckDB, MonetDB and SQLite.

> microbenchmark::microbenchmark(
+   a = collect(tbl(con_duck, 'df_test')),
+   times = 5)
Unit: seconds
 expr     min       lq     mean   median   
          

[VIDEO] Historic find in Tultepec; mammoth traps discovered   

Cache   

Vertical walls nearly two metres deep and 25 metres in diameter were found, which would corroborate the human intention to hunt mammoths.

The post "[VIDEO] Historic find in Tultepec; mammoth traps discovered" was first published on El Big Data.


          

Reports denied that police officer killed in Culiacán took part in the 'capture' of Ovidio Guzmán   

Cache   

The officer was off duty, but reported for work to contribute to the security operations to safeguard Sinaloan society.



          

Ashfall from Popocatépetl forecast for Tláhuac, Xochimilco, Tlalpan and Milpa Alta   

Cache   

In view of the ashfall, Protección Civil urged the population to stay informed and remember that "prevention is our strength".



          

Hitmen of 'Los Canchola' who threatened residents of Álvaro Obregón captured   

Cache   

The hitmen answer to 'El Yolo', who handles the distribution and sale of drugs in the district and is a close associate of Lenin Canchola.



          

[VIDEO] This is how the Culiacán police officer was massacred   

Cache   

After carrying out their attack, the suspected hitmen fled the mall car park at high speed.



          

In which municipalities will the loudspeaker test be activated in Edomex?   

Cache   

The agency stressed that the audio test is not a drill, and will last 50 seconds.



          

Brave dog defends its owners during a robbery; now fighting for its life   

Cache   

The dog, named 'Malevo', is a border collie, and its condition is serious.



          

Santería practitioner shot during ritual in GAM; "go do your witchcraft somewhere else," they told him   

Cache   

Preliminary reports indicate that the young man was executed while performing a religious ceremony.



          

Building Innovation Capability Into the Office Printing Industry   

Cache   

by Mitchell Filby, First Rock Consulting

Today, many businesses across many industries are in an accelerated state of change. This appetite for change is driven by their need to survive and prosper in a rapidly changing marketplace. This is no more evident than what is occurring across the office printing industry globally.

One of the many challenges (or some would say legacies) that this industry has faced and continues to face is that of its wonderful success globally over the past 30-odd years. From the office equipment manufacturers (OEMs) through to every player and provider that operates as part of the channels to market, it has been a wonderfully successful sales and marketing driven business. 

The business model that was driven via the OEM brands and many of the resellers and dealers provided the impetus to virtually saturate an ever-growing market. Therein lies the challenge.

A manufacturing-based business must be fed. When products are in demand and markets are growing, manufacturing is built to support output levels including peaks. The same strategy doesn’t usually apply when the direction of growth is going in the opposite direction. Typically, when manufacturing volumes decline, you either invest in better automation to bring down costs, pressure partners for lower input materials, cut back on costs through labor reduction (staff cutbacks, redundancies, etc.), reduce or lean out the supply chain that reduces inventory levels, or you reduce overall manufacturing volumes. Alternately, a combination of some or all the above may apply.

However, the impact of such changes, particularly of reducing manufacturing volumes, is that the unit cost of manufacturing goes up. At some stage there is a tipping point; if it is reached, unit prices will increase, and that increase will most likely be shared across regions and the globe.

In the past, when one region was in a flat or declining market, other regions could support the total manufacturing unit number, providing a level of support that protected global unit manufacturing costs. This is less likely to occur in the future, with falling unit and print volumes in most major markets over the last two to three years. Even with growth coming out of new geographical regions, it will still not be enough to offset the impending decline.

The Butterfly Effect

When manufacturing costs increase, they are often passed on through the channel and ultimately to the end customer. When this occurs, market demand can further decline as pricing is not as favorable in comparison to other brands. Providers (direct sales teams, dealers and resellers) that distribute, market and sell the higher cost product feel the effect as well. 

Customers ultimately have a choice. When customers struggle to see the value or difference between one brand or the other, they will usually fall back to price.

Ultimately what occurs is the ripple effect (butterfly effect). We have had these effects in the office printing industry in the past, but the ripple has always been so lightly felt. However, this is changing in a very dramatic way. 

As print volumes continue to decline globally due to increasing digitization and coupled with increasing device consolidation, the office print industry globally is on the tail end of market maturity. 

The impact will be that brands (OEMs), print providers, dealers and resellers will be squeezed through tighter margins. Resellers and dealers may be the first to change or be impacted as many are on the front line and feel the effects first. 

Some of the smarter businesses are doing just that – they are changing and they are being assisted by able business partners. However, the majority are still holding out and believing the bathwater is still warm. By the time these organizations recognize the temperature has gone cold, it will be all too late. 

So, we now know why this is happening, but what is the larger impact on the industry? 

The industry is currently feeling the effects of a shortening business model, or manufacturing-based lifecycle. Instead of a product (or product set) having a manufacturing life span of 20, 30 or 40 years, the office printing industry is now on the verge of a sharp fall. The product set is running out of runway, and I'm not referring to a new piece of hardware (kit) released every five years or a retooling of the plant.

Obviously, redesigning manufacturing plants to manage lower production cycles is an option, but it’s not going to be the strategy of all the remaining players. Maybe the last one or two within the industry sector can play this game. The others must take a different path, deploy a different strategy.

The Market Maturity Tail is Shortening

Although the Rogers Bell Curve (see figure below) is about the technology adoption lifecycle, it is equally useful to illustrate the focus on innovation. The reason why innovation has become increasingly important is more related to the shortening of products in the market at the back end of the cycle. If you have a shortening cycle or your product matures quicker, the focus and attention must be on renewing or innovating faster. Many industries — and the office printing industry is one of them — are now seeing the full effects of a shortening or decreasing market life span.

[Figure: the Rogers Bell Curve. Does innovation = disruption?]

This is not to say that the industry will disappear anytime soon because it won’t. It will continue to exist, but not the way it has in the past. We are now in one of the latter phases of a flattening market (short market tail). We have seen, and will continue to see, brands (manufacturers) consolidate. 

Today, in fact, we are witnessing some of the most successful global brands pivoting and refocusing their direction on a new future. They are either exiting the office print industry altogether, merging or being acquired by another office printing brand. Some are choosing to split parts of their business to either refocus their attention to new adjacent markets or create a new play in a new market.

Some have also recognized that breaking the business up helps to create and make transparent a new asset value (class) that was previously trapped in a somewhat slowly decaying carcass. Maybe this is more of a push to satisfy shareholders or to take advantage of available investment funds.

Whatever the reason, the office printing industry is at an inflection point.

Where to From Here?

Well, the playing field is certainly changing, and innovation has sped up. In fact, it has accelerated at a phenomenal rate due to the realization that businesses can no longer launch an innovative product (or business) in the market and watch it have a 30-plus year run. It is more likely that it will be obsolete, redundant or disrupted within five, 10 or 15 years. 

Due to the shorter life, organizations are looking to innovation (including some disruptive innovation) to sustain their growth. Incremental innovation (e.g., a faster, more powerful processor, or better process improvement) won't be enough. The focus must be on leapfrog innovation: innovation that takes a "design thinking" perspective rather than looking only at today's environment and requirements.

Parallel to this changing dynamic is the increasing emphasis on aligning businesses to a customer-centric view, or businesses focused on the future industries that are evolving such as the “customer immersion industry” or the “customer experience industry.” 

We are seeing many clients, customers and consumers around the world continue to be educated and conditioned around the benefits of such permeating technologies led by an assortment of technology players who provide solutions and experiences in the areas of mobility, big data, cloud, virtual reality, augmented reality, analytics, IoT, 3D, drones, AI — and the list goes on. 

This accelerated appetite of innovation takes advantage of technology infrastructures, platforms and applications that are driven and leveraged by a variety of existing and new technologies and customer-centric business models. 

Additionally, the ability to compete and gain access to low-cost technology is changing the way both new and old businesses compete. Green field businesses are now disrupting older, well-entrenched businesses of the past. Business that have been successful in the past are now being eroded by new entrants that are using technology and low-cost models to outcompete the competition. 

Businesses that have built a successful business on decades of experience and infrastructure are being challenged by businesses that don’t carry dedicated workforces or don’t carry legacy infrastructure to operate. They scale, they are agile and deploy quickly. They are becoming the new norm. 

They operate on an agile business model and are changing the way they engage their customers. These new business models are shaped around being more customer-centric, engaging at an individual level, being able to respond at a faster rate and at a deeper level through predictive analytics.

Therefore, the question that now remains is — how is your business going to adapt going forward? How are you going to drive leapfrog innovation, not just incremental change? The way you have always done things in the past does not mean or guarantee your success in the future. 

As someone said to me, “if nothing changes, nothing changes.”


          

Talking Business Transformation: KMBS Partners With MWA Intelligence   

Cache   

by Robert Palmer

In this industry, there is no shortage of announcements covering strategic partnerships, mergers, and acquisitions. The phrase “partner or perish” is a term that resurfaces on a regular basis and always seems to find its way into industry blogs, articles, and presentations at various conferences and events. There are so many partnership deals these days—particularly between OEMs and software/solutions providers—that it can be difficult to sift through the numerous cross-functional relationships.

Last week, however, an important strategic alliance was announced that is likely to have major implications for the printing business. On February 17, MWA Intelligence Inc. announced that it has partnered with Konica Minolta Business Solutions (KMBS) to offer MWAi’s FORZA platform to dealers and other providers in the imaging channel. Built on the highly successful SAP Business One platform, MWAi’s FORZA offers dealers an alternative to current business systems and third party software programs by providing an open architecture ERP system designed specifically for the office equipment channel. 

As a true ERP system, FORZA enables real time business decisions by capturing all critical information across various departments, including sales, customers, operations, finances, and service, making information instantly available and accessible. As a result, FORZA can help dealers replace disparate legacy business systems to provide improvements in data access and integration, while meeting the needs for system automation and big data analytics as the industry continues to evolve. Meanwhile, FORZA supports and in fact expands functionality in many of the core business components that are essential to office equipment dealers, such as meter management, CPC and MPS contracts, rentals and leasing, service dispatch, and the list goes on. 

Why is this deal so important? As the office industry continues to evolve it is putting increased pressure on dealers and service providers to diversify and expand their businesses. Transformation is yet another buzzword that has become synonymous with the office imaging market. Yet, business model transformation does not come easy. There are explosive growth opportunities in adjacent businesses such as managed IT services, digital signage, 3D printing, and workflow solutions, but many dealers are often stymied by limitations with existing business system software, which could be decades old and not designed to support the integration of new business lines. 

What FORZA with SAP Business One provides is a platform for growth. The office-imaging channel has enjoyed great success over the years by creating a high-value service model that attracts customers and turns them into long-term clients generating significant annuity business. The channel has demonstrated a unique ability to adapt to changing market conditions over the years, but the game is definitely changing. The idea that legacy systems designed specifically for the copier/MFP business could be customized to support multiple business models is no longer a safe bet. In reality, dealers need to optimize not just to support a new line of business, but instead to support any new line of business. 

What is interesting and quite telling is that Konica Minolta has recognized this trend and is partnering with MWAi to help its dealer base make the transition. Of course, Konica Minolta is transforming its own business by moving to a services-led model, fueled by the acquisition of All Covered and its growing position in the IT services space. Konica Minolta understands the importance of shoring up its core printing business, while at the same time diving deeper into adjacent markets to drive growth. Now, it is partnering with MWAi to help its dealers achieve similar transformation. The partnership between KMBS and MWA Intelligence could well represent a watershed moment for the imaging channel.   

Robert Palmer is chief analyst and a managing partner for BPO Media, which publishes The Imaging Channel and Workflow magazines. He is an independent market analyst and industry consultant with more than 25 years experience in the printing industry covering technology and business sectors for prominent market research firms such as Lyra Research and InfoTrends. Palmer is a popular speaker and presents regularly at industry conferences and trade events in the U.S., Europe, and Japan. He is also active in a variety of imaging industry forums and currently serves on the board of directors for the Managed Print Services Association (MPSA). Contact him at robert@bpomedia.com.

In this industry, there is no shortage of announcements covering strategic partnerships, mergers, and acquisitions. The phrase “partner or perish” is a term that resurfaces on a regular basis and always seems to find its way into industry blogs, articles, and presentations at various conferences and events. There are so many partnership deals these days—particularly between OEMs and software/solutions providers—that it can be difficult to sift through the numerous cross-functional relationships.

Last week, however, an important strategic alliance was announced that is likely to have major implications for the printing business. On February 17, MWA Intelligence Inc. announced that it has partnered with Konica Minolta Business Solutions (KMBS) to offer MWAi’s FORZA platform to dealers and other providers in the imaging channel. Built on the highly successful SAP Business One platform, MWAi’s FORZA offers dealers an alternative to current business systems and third party software programs by providing an open architecture ERP system designed specifically for the office equipment channel.

As a true ERP system, FORZA enables real time business decisions by capturing all critical information across various departments, including sales, customers, operations, finances, and service, making information instantly available and accessible. As a result, FORZA can help dealers replace disparate legacy business systems to provide improvements in data access and integration, while meeting the needs for system automation and big data analytics as the industry continues to evolve. Meanwhile, FORZA supports and in fact expands functionality in many of the core business components that are essential to office equipment dealers, such as meter management, CPC and MPS contracts, rentals and leasing, service dispatch, and the list goes on.

Why is this deal so important? As the office industry continues to evolve it is putting increased pressure on dealers and service providers to diversify and expand their businesses. Transformation is yet another buzzword that has become synonymous with the office imaging market. Yet, business model transformation does not come easy. There are explosive growth opportunities in adjacent businesses such as managed IT services, digital signage, 3D printing, and workflow solutions, but many dealers are often stymied by limitations with existing business system software, which could be decades old and not designed to support the integration of new business lines.

What FORZA with SAP Business One provides is a platform for growth. The office-imaging channel has enjoyed great success over the years by creating a high-value service model that attracts customers and turns them into long-term clients generating significant annuity business. The channel has demonstrated a unique ability to adapt to changing market conditions over the years, but the game is definitely changing. The idea that legacy systems designed specifically for the copier/MFP business could be customized to support multiple business models is no longer a safe bet. In reality, dealers need to optimize not just to support a new line of business, but instead to support any new line of business.

What is interesting and quite telling is that Konica Minolta has recognized this trend and is partnering with MWAi to help its dealer base make the transition. Of course, Konica Minolta is transforming its own business by moving to a services-led model, fueled by the acquisition of All Covered and its growing position in the IT services space. Konica Minolta understands the importance of shoring up its core printing business, while at the same time diving deeper into adjacent markets to drive growth. Now, it is partnering with MWAi to help its dealers achieve similar transformation. The partnership between KMBS and MWA Intelligence could well represent a watershed moment for the imaging channel.   

In this industry, there is no shortage of announcements covering strategic partnerships, mergers, and acquisitions. The phrase “partner or perish” is a term that resurfaces on a regular basis and always seems to find its way into industry blogs, articles, and presentations at various conferences and events. There are so many partnership deals these days—particularly between OEMs and software/solutions providers—that it can be difficult to sift through the numerous cross-functional relationships.

Last week, however, an important strategic alliance was announced that is likely to have major implications for the printing business. On February 17, MWA Intelligence Inc. announced that it has partnered with Konica Minolta Business Solutions (KMBS) to offer MWAi’s FORZA platform to dealers and other providers in the imaging channel. Built on the highly successful SAP Business One platform, MWAi’s FORZA offers dealers an alternative to current business systems and third party software programs by providing an open architecture ERP system designed specifically for the office equipment channel.

As a true ERP system, FORZA enables real-time business decisions by capturing all critical information across various departments, including sales, customers, operations, finances, and service, making information instantly available and accessible. As a result, FORZA can help dealers replace disparate legacy business systems to provide improvements in data access and integration, while meeting the needs for system automation and big data analytics as the industry continues to evolve. Meanwhile, FORZA supports and in fact expands functionality in many of the core business components that are essential to office equipment dealers, such as meter management, CPC and MPS contracts, rentals and leasing, service dispatch, and the list goes on.

Why is this deal so important? As the office industry continues to evolve, it puts increased pressure on dealers and service providers to diversify and expand their businesses. Transformation is yet another buzzword that has become synonymous with the office imaging market. Yet business model transformation does not come easy. There are explosive growth opportunities in adjacent businesses such as managed IT services, digital signage, 3D printing, and workflow solutions, but many dealers are often stymied by limitations of existing business system software, which may be decades old and was never designed to support the integration of new business lines.

What FORZA with SAP Business One provides is a platform for growth. The office-imaging channel has enjoyed great success over the years by creating a high-value service model that attracts customers and turns them into long-term clients generating significant annuity business. The channel has demonstrated a unique ability to adapt to changing market conditions over the years, but the game is definitely changing. The idea that legacy systems designed specifically for the copier/MFP business could be customized to support multiple business models is no longer a safe bet. In reality, dealers need to optimize not just to support a new line of business, but instead to support any new line of business.

What is interesting and quite telling is that Konica Minolta has recognized this trend and is partnering with MWAi to help its dealer base make the transition. Of course, Konica Minolta is transforming its own business by moving to a services-led model, fueled by the acquisition of All Covered and its growing position in the IT services space. Konica Minolta understands the importance of shoring up its core printing business, while at the same time diving deeper into adjacent markets to drive growth. Now, it is partnering with MWAi to help its dealers achieve similar transformation. The partnership between KMBS and MWA Intelligence could well represent a watershed moment for the imaging channel.   

Robert Palmer is chief analyst and a managing partner for BPO Media, which publishes The Imaging Channel and Workflow magazines. He is an independent market analyst and industry consultant with more than 25 years' experience in the printing industry, covering technology and business sectors for prominent market research firms such as Lyra Research and InfoTrends. Palmer is a popular speaker and presents regularly at industry conferences and trade events in the U.S., Europe, and Japan. He is also active in a variety of imaging industry forums and currently serves on the board of directors for the Managed Print Services Association (MPSA). Contact him at robert@bpomedia.com.

          

11/19/2019 - DigiMarCon World 2019 - Digital Marketing Conference   


DigiMarCon World 2019 is your chance to:
- Hear from some of the most audacious and thought-provoking speakers in the digital marketing industry.
- Gain insight into emerging strategies, the latest innovative technologies, and best practices to move your business to the next level.
- Network with thought leaders, collaborate with your peers and build your network in a beautiful atmosphere.

DigiMarCon World 2019 Digital Marketing Conference & Exhibition will be held online from November 19th to 21st, 2019, available via live stream and on demand. It is the largest digital marketing event in the world and will be attended by thousands of digital marketing professionals. Whatever your goal is (reinforcing customer loyalty, improving lead generation, increasing sales, or driving stronger consumer engagement), the DigiMarCon World 2019 line-up has been specifically designed to help you develop your audience. Immerse yourself in topics like digital strategy, programmatic advertising, web experience management, usability/design, mobile marketing & retargeting, customer engagement, user acquisition, social media marketing, targeting & optimization, video marketing, data science & big data, web analytics & A/B testing, email marketing, content marketing, conversion rate optimization, search engine optimization, paid search marketing, geo-targeting, predictive analysis & attribution, growth hacking, growth marketing tools, marketing & sales automation, sustainable growth strategies, product marketing & UX/UI and much, much more! At DigiMarCon World 2019, you will receive all the elements you need to achieve digital marketing success! Conventional thought will be challenged, new ways of thinking will emerge, and you will leave with your head and notebooks full of action items and ideas to lead your agency / team / account to even greater success.
Be a part of DigiMarCon World 2019 and discover how to thrive and succeed as a marketer in a rapidly evolving digital world. Secure your seat now and take advantage of our discounted super early bird registration rates. For more details visit https://God.blue/splash.php?url=8wZe7naE2lv7W1dYLwXkeCRGtWwScqzgs_PLUS_7qKjOB5d4sgTu5n_SLASH_auAAtFvz78JBJARZkRRui99nWcNiOhU9xDPp3lC5I28xvE8K1VYD3m4vs_EQUALS_.

          

Senior Data Scientist - Tampa (MacDill AFB), FL   


This Senior Data Science position is located at CENTCOM, MacDill AFB, FL, working on a five-person team with two other data scientists and two methodologists. Required Security Clearance: TS/SCI. Required Education: A relevant Bachelor's degree may supplement four years of experience, and a relevant Master's degree an additional two years; see Required Experience. Required Experience: 10+ years of relevant data science experience. Functional Responsibility: Duties associated with the qualifications listed below. Qualifications: Experience applying multidisciplinary mathematical and statistical models via a programming language to large datasets to extract patterns, relationships, and anticipatory behavioral likelihoods that may not be apparent using traditional single-discipline means. Experience developing tradecraft techniques and training solutions for discovery, preparation, manipulation, and normalization of big data so that methods are repeatable and can be explained to analysts. Experience using mathematical concepts and techniques to solve complex GEOINT analysis problem sets, and understanding of concepts associated with structured data and relational databases. Experience understanding and explaining the relationship between the data collected for a real-world problem and the required structure of a relational database to help solve that problem. Experience writing scripts in Visual Basic, R, Python, Java, JavaScript, C++ or other software for modeling processes, with a focus on repeatability, efficiency, knowledge capture, and hypothesis testing. Experience using tools such as ArcGIS, Excel, Python, SPSS, R, or other statistical packages to analyze and visualize data both temporally and spatially to assist in data integrity checks, ask the next question, and display analytical assessments.
Experience maintaining, moving, and manipulating data between applications using appropriate software and/or Extract-Transform-Load (ETL) procedures: Microsoft Excel spreadsheets, Microsoft Access database management system and/or Oracle, PostgreSQL, or SQL Server, and importing and cleaning analyst-provided datasets (Excel, geospatial data, etc.). Experience using statistical software (SPSS, SAS, MATLAB, etc.), desktop software (MS Office and Access), and the Windows operating environment. Of particular importance are software packages used for advanced statistical analysis of operational data. Working Conditions: Work is typically based in a busy office environment and subject to frequent interruptions. Business work hours are normally Monday through Friday, 8:00 am to 5:00 pm. Additional details on the precise core hours will be communicated to the candidate by the Program Manager/Hiring Manager. Physical Requirements: May be required to lift and carry items weighing up to 25 lbs. Requires intermittent standing, walking, sitting, squatting, stretching and bending throughout the work day. Background Screening/Check/Investigation: Successful completion of a background screening/check/investigation will/may be required as a condition of hire. Employment Type: Full-time / Exempt. Benefits: Metronome offers competitive compensation, a flexible benefits package, and career development opportunities that reflect its commitment to creating a diverse and supportive workplace. Benefits include (not all-inclusive): Medical, Vision & Dental Insurance, Paid Time-Off & Company Paid Holidays, Personal Development & Learning Opportunities. Other: An Equal Opportunity Employer: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status or disability status.
Metronome LLC is committed to providing reasonable accommodations to employees and applicants for employment, to assure that individuals with disabilities enjoy full access to equal employment opportunity (EEO). Metronome LLC shall provide reasonable accommodations for known physical or mental limitations of qualified employees and applicants with disabilities, unless Metronome can demonstrate that a particular accommodation would impose an undue hardship on business operations. Applicants requesting a reasonable accommodation may make a request by contacting us.
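As a rough, hypothetical illustration of the "repeatable and explainable" scripting these qualifications describe, the sketch below normalizes a small set of raw records and summarizes a numeric field per region so the same steps can be rerun on new data. The dataset, field names, and cleaning rules are invented for illustration and are not from the posting.

```python
# Hypothetical sketch: normalize raw records, then summarize a numeric
# field per region. Repeatable and easy to explain to an analyst.
from statistics import mean, pstdev

def clean_records(rows):
    """Drop rows with missing values and coerce the numeric field."""
    cleaned = []
    for row in rows:
        if row.get("region") and row.get("value") not in (None, ""):
            cleaned.append({"region": row["region"].strip().lower(),
                            "value": float(row["value"])})
    return cleaned

def summarize(rows):
    """Per-region count, mean and spread, for side-by-side comparison."""
    by_region = {}
    for row in rows:
        by_region.setdefault(row["region"], []).append(row["value"])
    return {r: {"n": len(v), "mean": mean(v), "std": pstdev(v)}
            for r, v in by_region.items()}

raw = [
    {"region": "North ", "value": "10"},
    {"region": "north", "value": "14"},
    {"region": "South", "value": None},   # dropped during cleaning
    {"region": "south", "value": "8"},
]
stats = summarize(clean_records(raw))
```

Because every step is a plain function over plain data, the same script can be rerun on the next batch and each transformation can be walked through with an analyst.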

          

Big data: X5 releases a data-segmentation product for targeted advertising.   


X5 Retail Group, Russia's leading multi-format retail company, is the first retailer to launch a service automating the provision of data segments for clients...

          

TED: Generating Small Data Can Solve Big Problems (by Michael Mesterharm at TEDxUND)   


See what you think of the following TED talk:

Generating Small Data Can Solve Big Problems: Michael Mesterharm at TEDxUND



Michael Mesterharm, a 2009 Notre Dame alumnus pursuing a master's degree in nonprofit administration and working at the Mercy Home for Boys and Girls in Chicago, appreciates the importance of big data but wants people to know that small data can positively influence your work or personal life. Even though he is not a "math person," he says, he has created some simple charts that allow him (or anyone, for that matter) to track his students' work and grades and ensure they get the assistance they need.
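A toy version of that "small data" idea might look like the sketch below: a few hand-kept scores are enough to flag who needs help. The names, scores, and the cutoff are all invented for illustration; they are not from the talk.

```python
# Minimal "small data" sketch: a hand-kept gradebook and a one-line
# rule for spotting students who may need extra assistance.
grades = {
    "Ana":   [88, 92, 79],
    "Ben":   [55, 61, 58],
    "Chris": [70, 68, 95],
}

def needs_assistance(gradebook, cutoff=65):
    """Return students whose average score falls below the cutoff."""
    return sorted(name for name, scores in gradebook.items()
                  if sum(scores) / len(scores) < cutoff)

flagged = needs_assistance(grades)
```

No statistics background is required to maintain or read this, which is exactly the point of the talk.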

In the spirit of ideas worth spreading, TEDx is a program of local, self-organized events that bring people together to share a TED-like experience. At a TEDx event, TEDTalks video and live speakers combine to spark deep discussion and connection in a small group. These local, self-organized events are branded TEDx, where x = independently organized TED event. The TED Conference provides general guidance for the TEDx program, but individual TEDx events are self-organized.* (*Subject to certain rules and regulations)


          

The Technology Politics of Mechanizing Crops: Insights from California Agriculture, 1945-1985, Nov 4   


This is the age of agricultural robots. Media outlets from the Los Angeles Times to The New Yorker are publishing articles on new robotic advances. The contemporary fascination with robots, artificial intelligence, ‘big data’, and other Silicon-Valley-inspired technology is rooted in a long history of framing automation as natural and inevitable. For the past century, the agricultural industry in California has eagerly anticipated farm mechanization: the replacement of human labor by machine labor in the cultivation and harvesting of vegetable, fruit, and nut crops. Scientists, engineers, and industry boosters have repeatedly promised that new machines will resolve labor shortages, reverse shrinking profit margins, stabilize immigration, sidestep union troubles, and put an end to “back-breaking” working conditions. To evaluate the influence of techno-optimism in agriculture, we trace evolving efforts to mechanize fruit and vegetable crops in California from 1945-1985. During this period, mechanization profoundly transformed farms and farmers: to enable mechanized harvesting, crops were re-engineered, farmland simplified and made uniform, and entire supply chains reconfigured to act like factory lines. But history also shows that mechanization faces significant biological, social, and environmental barriers—not to mention active opposition from farmworkers and rural communities worried about their livelihoods. This suggests we need to look critically into the dynamics and politics of mechanization, and to question the prevailing narratives of inevitable progress.

          

IT / Software / Systems: Senior Data Engineer - SQL / Redshift / AWS - Premier Ecommerce Publishing Brand - Los Angeles, California   


Are you a Senior Data Engineer with a strong SQL, ETL, Redshift and AWS background seeking an opportunity to work with massive amounts of data in a very hip marketplace? Are you a Senior Data Engineer interested in unifying data across various consumer outlets for a very well-funded lifestyle brand in the heart of Santa Monica? Are you an accomplished Senior Data Engineer looking for an opportunity to work in a cutting-edge tech environment consisting of SQL, Redshift, Hadoop, Spark, Kafka and AWS? If yes, please continue reading... Based in Santa Monica, this thriving lifestyle brand has doubled in size in the last year and keeps on growing. With over $75 million in funding, they work hard to provide their extensive audience with advice and recommendations in all things lifestyle: where to shop, eat, travel, etc. Branching into a number of different services and products over the next 12 months, they are building out their engineering team. They are looking for a Senior Data Engineer to unify and bring to life mass amounts of data from all areas of the business: ecommerce, retail, content, web, mobile, advertising, marketing, experiential and more.
WHAT YOU WILL BE DOING:
- Architect new and innovative data systems that will allow individuals to use data in impactful and exciting ways
- Design, implement, and optimize Data Lakes and Data Warehouses to handle the needs of a growing business
- Build solutions that will leverage real-time data and machine learning models
- Build and maintain ETLs from third-party sources and ensure data quality
- Create data models at all levels, including conceptual, logical, and physical, for both relational and dimensional solutions
- Work closely with teams to optimize data delivery and scalability
- Design and build complex solutions with an emphasis on performance, scalability, and high reliability
- Design and implement new product features and research the next wave of technology

WHAT YOU NEED:
- Extensive experience and knowledge of SQL, ETL and Redshift
- Experience wrangling large amounts of data
- Skilled in Python for scripting
- Experience with AWS
- Experience with Big Data tools (Hadoop, Spark, Kafka) is a nice plus
- Ability to enhance and maintain a data warehouse, including use of ETL tools
- Successful track record in building real-time ETL pipelines from scratch
- Previous ecommerce or startup experience is a plus
- Understanding of data science and machine learning technologies
- Strong problem-solving capabilities
- Strong collaborator and passionate advocate for data
- Bachelor's degree in Computer Science, Engineering, Math or similar

WHAT YOU GET:
- Join a team of humble, creative and open-minded engineers shipping exceptional products consumers love to use
- Opportunity to work at an awesome lifestyle brand in growth mode
- Brand new office space; open and team-oriented environment
- Full medical, dental and vision benefits
- 401k plan
- Unlimited vacation
- Summer vacations / time off
- Offices closed during winter holidays and New Year's
- Discounts on products
- Other perks

So, if you are a Senior Data Engineer seeking an opportunity to grow with a global lifestyle brand at the cusp of something huge, apply now!
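As a minimal, hypothetical sketch of what one stage of the real-time ETL pipelines mentioned above might look like: extract raw events from a feed, apply a data-quality gate, and load the survivors into a warehouse table. This is pure in-memory Python; the feed contents, field names, and deduplication rule are illustrative assumptions, not the company's actual stack (which the posting says targets Redshift and AWS).

```python
# Hypothetical extract-transform-load stage, sketched in memory.

def extract(source):
    """Pretend third-party feed: yields raw event dicts."""
    yield from source

def transform(events, seen=None):
    """Data-quality gate: require an id and amount, drop duplicates."""
    seen = set() if seen is None else seen
    for e in events:
        if not e.get("id") or "amount" not in e:
            continue            # reject malformed rows
        if e["id"] in seen:
            continue            # deduplicate on id
        seen.add(e["id"])
        yield {"id": e["id"], "amount": round(float(e["amount"]), 2)}

def load(events, table):
    """Append clean rows to the warehouse table; return rows loaded."""
    rows = list(events)
    table.extend(rows)
    return len(rows)

warehouse = []
feed = [{"id": "a1", "amount": "19.993"},
        {"id": "a1", "amount": "19.993"},   # duplicate
        {"id": None, "amount": "5"},        # malformed
        {"id": "b2", "amount": 7}]
loaded = load(transform(extract(feed)), warehouse)
```

Because each stage is a generator, records stream through one at a time, which is the same shape a real pipeline has when the source is a Kafka topic instead of a list.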

          

Tech Excellence: App Developer, Husqvarna   


Semcon, Husqvarna and Saab are together making a new push into digitalisation, robotisation and electrification. We believe that you, as an applicant for the App Developer position, are passionate about technology development and want to help break new ground. You keep up with the latest developments in, for example, big data, AI, interoperability, holograms, visualisation and gamification. Semcon, Husqvarna and Saab...

          

Sophia and Phil, the 'stars' of the Web Summit   


Ana Santillana

Lisbon, 6 Nov (EFE).- She is the big star of the Web Summit. Sophia, the Hanson Robotics robot, today introduced her 'older brother', Phil, inspired by science-fiction author Philip K. Dick, and delivered an encouraging message to thousands of people: 'I don't want to replace humans.'

Under the guidance of Ben Goertzel, chief scientist of SingularityNET, and David Hanson of Hanson Robotics, Sophia explained confidently: 'I know you are proud of being natural, and I am just as proud of being artificial.'

'This is my third year of life, in which I have achieved a greater capacity for dialogue,' the artificial intelligence (AI) machine summed up on the main stage of the Altice Arena, filled to its 11,000-person capacity.

'This doesn't feel like a technology conference; it feels like a family reunion,' the robot remarked at the start of her appearance.

Sophia appeared at this edition with a more refined voice and face, the result of 'progressive sophistication', Goertzel noted.

The goal is not for robots to 'clean floors or do the laundry', but to 'mark the next revolution in artificial intelligence', Goertzel said.

Robots, the head of Hanson Robotics said in his turn, are here to stay, 'whether we like it or not'.

That is why, he added, one of the goals of his AI company is 'to train them and hope they grow up like a human child'.

Hanson even suggested sharing an 'emotional and social space' with robots so they can express themselves and learn their model of speech from humans.

Sophia is an example. Her empathy and creativity have managed to captivate everyone, beneath a latex face and metal legs.

Of Saudi nationality, the humanoid has a brain made of circuits and boards that goes unnoticed thanks to her gestures, which express (or rather imitate) emotion, happiness or excitement, depending on what she hears.

Her brother, eerily realistic, has gone from being a database of Philip K. Dick's texts to a complex humanoid meant to continue Sophia's trajectory.

While the earliest version of Phil was created in 2005, Sophia was 'born' in 2017, becoming the first humanoid to take part in the United Nations development programme and obtaining Saudi citizenship.

The Web Summit, which concludes tomorrow, Thursday, has drawn more than 70,000 visitors and 1,200 speakers to this edition, who have discussed, among other topics, analytics, big data, funding, and the need for regulation to limit the actions of the tech giants, the 'big tech'. EFE


          

THE FURNITURE SECTOR'S NEW FAVOURITE, CNR İMOB OBJET, OPENS ITS DOORS   


The global furniture, home-textile and decoration sector, which has grown to more than 200 billion dollars, is coming together at CNR İMOB OBJET, the Istanbul Furniture, Home Textile, Design, Decoration and Lighting Fair. Organised by Pozitif Fuarcılık, a CNR Holding company, with the support of KOSGEB, the fair opens its doors at the CNR Expo Istanbul Fair Centre from 5 to 10 November. CNR Holding, which has integrated its 16 years of fair experience in the furniture sector and 26 years in home textiles and decoration with the big-data management system of the Business Intelligence Agency - BIA (Global Market Intelligence System), holding data on 10 million exhibitors and visitors across 186 countries on five continents, will bring professional buyers from more than 100 countries together with over 500 brands at the fair.

          

5G combined with AI, AR, VR and other technologies will be the new electricity for smart world: Huawei Chairman   


On November 4, during the opening keynote of the Web Summit in Portugal, Guo Ping, Huawei's Rotating Chairman, invited global developers to take advantage of the early 5G environment and develop apps and software on the 5G platform. In his speech, Guo described how, in combination with technologies such as AI, big data, virtual reality (VR) and […]

The post 5G combined with AI, AR, VR and other technologies will be the new electricity for smart world: Huawei Chairman appeared first on Huawei Central.


          

Good Practice - "USING BIG DATA TO INCREASE RESOLUTION CAPACITY IN HOSPITAL OUTPATIENT CLINICS"   


"USING BIG DATA TO INCREASE RESOLUTION CAPACITY IN HOSPITAL OUTPATIENT CLINICS"

 

The Club Excelencia en Gestión, through its Knowledge Bank, shares good practices that can serve as a reference for other organisations interested in improving their management. The aim is to recognise, encourage and strengthen the adoption of good practices in the field of quality and excellence, thereby contributing to the spread of management knowledge.

This good practice took part in the 2019 Annual Management Awards: Transforming Excellence.

This document shares the experience of how the Hospital Universitario Infanta Elena improves access for patients referred from primary care to hospital outpatient clinics, attending to them in the most appropriate setting, and increases the resolution capacity of its professionals by redesigning processes supported by information and communication technologies (ICTs).

The main lesson learned is that redesigning the outpatient care process with the inclusion of ICTs has increased the resolution capacity of the professionals who attend patients; it affects dimensions of care quality such as efficiency, accessibility, appropriateness, safety and acceptability, and contributes to the sustainability of the health system.

 



          

Need a freelancer for Java/big data   


For a classified organization, we need two Spark workers or Java specialists who will study Spark with us; long-term, full-time job. Security clearance is a must!

          

With a focus on 5G integration, Hengtong presents itself at PT Expo China 2019   


Hengtong Optic-Electric: Suzhou, China (ots/PRNewswire) - PT Expo China 2019 recently took place in Beijing. Hengtong took part in the trade fair and showed intelligent 5G applications, solutions for big data and security, the internet of energy, industrial ...

          

Big Data Engineer - colossal systems - Richmond, QC   


8+ years of IT experience, including strong 4+ years' experience in *Big Data* and 3+ years' experience on the *Spark* platform; proficient in coding in *Scala*.
From Indeed - Thu, 31 Oct 2019 15:19:07 GMT - View all Richmond, QC jobs

          

HAWK PROJECT — IMPLEMENT IOT ON THE BLOCKCHAIN BECOME A REAL   


IoT, a phenomenon that connects things to provide key physical data and processes that data in the cloud, presents opportunities for all. Despite its advancements, it is still constrained by certain problems. Many IoT systems are poorly designed and implemented using diverse protocols and technologies that create complex configurations. IoT is also prone to security challenges such as the hacking of baby monitors, drug infusion pumps, cameras, rifles, etc. Due to its centralized nature, connecting more devices than it can carry poses a serious issue. HAWK, a decentralized and trailblazing platform, helps to improve IoT use and development by preparing algorithms for how to effectively manage and store IoT data. This will furthermore enhance light-speed transfer of data, transparency and security. The network also helps to eliminate the challenges of piracy and reliability, using blockchain technology to bring various smart devices together. The HAWK network presents us with an application called the Black Hawk Knight. A feature of densely congested cities is traffic. When all cars, even electric cars, are stuck in traffic and you are just five minutes away from your destination, what do you do? That is where BHK comes in. The application provides electric scooter services, since scooters are mostly used for short-distance travel. Users of the application pay a small token for the service, and the project's investors share the dividends. The project aims to distribute electric scooters across 50 countries.

FEATURES OF THE HAWK PLATFORM
- Blockchain technology
- Intelligent terminal
- Trusted network
- Smart contracts
- Edge computing
- Big data

TOKEN DETAILS
Hawk Network has a utility token known as the HAWK token, based on the Ethereum ERC20 standard. The HAWK token can be used for numerous purposes, including payments, transactions, circulation, etc.
An ICO has been scheduled, during which a total of 20,000,000,000 HAWK tokens will be sold at the rate of $0.0067 per HAWK. Thirty (30) days after the public placement, the HAWK tokens will be distributed. Hawk Network has a hard cap valued at $30 million.

TOKEN DISTRIBUTION
- 49% - ore pool
- 15% - foundation
- 13% - team
- 10% - public offering wheel
- 10% - ecological stimulation
- 2% - private placement round
- 1% - seed round

HAWK NETWORK ROADMAP
- Oct 2018 - Project research and project initiation
- Dec 2018 - Complete project planning and overall design; complete the relevant underlying technology selection; research and dock the intelligent hardware interface
- Feb 2019 - Complete economic model design; draft the first edition of the white paper; first DAPP application BHK presale officially online
- Apr 2019 - Complete the edge computing technology demonstration; complete the Intelligent IoT Gateway Agreement; complete the technical architecture
- Jun 2019 - Hawk Network starts underlying coding; open private fundraising; start global channel promotion; community partner recruitment; recruitment of city partners
- Oct 2019 - Open IEO; global meetups; first Dapp online; first equipment starts mining; top exchanges start trading HAWK

The HAWK NETWORK ensures reliability, security, data exchange and transactions, data circulation, etc., with the improvement of IoT technologies.
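For what it is worth, the supply and distribution figures quoted above can be sanity-checked with a few lines of arithmetic. The percentages, total supply, and sale price are taken from the post; the computation itself (splitting the supply by tranche and pricing the public offering tranche) is ours.

```python
# Split the stated 20,000,000,000 HAWK supply by the published
# distribution percentages, and price the public tranche at $0.0067.
TOTAL_SUPPLY = 20_000_000_000
SALE_PRICE_USD = 0.0067
DISTRIBUTION = {                 # percentages as published in the post
    "ore pool": 49,
    "foundation": 15,
    "team": 13,
    "public offering wheel": 10,
    "ecological stimulation": 10,
    "private placement round": 2,
    "seed round": 1,
}

# Token count allocated to each tranche.
allocations = {k: TOTAL_SUPPLY * pct // 100 for k, pct in DISTRIBUTION.items()}

# USD value of the public offering tranche at the quoted sale price.
public_raise = allocations["public offering wheel"] * SALE_PRICE_USD
```

The percentages do sum to 100, and the 10% public tranche (2 billion tokens) at $0.0067 corresponds to roughly $13.4 million, comfortably under the stated $30 million hard cap.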
For more details on the project, please click on any of the links below: Website: https://God.blue/splash.php?url=lSFO39GIJNiF_SLASH_76da4IvvBoMc5JvjyZoSzRoZgKYeq0369x19zxbKGBNb3dCQ9cgQxCPiggV7QXeKaSegDG3M_PLUS_axgXxqGekE_SLASH_jAxHvLjce4_EQUALS_ Whitepaper: https://God.blue/splash.php?url=lSFO39GIJNiF_SLASH_76da4IvvBoMc5JvjyZoSzRoZgKYeq0369x19zxbKGBNb3dCQ9cgQxCPiggV7QXeKaSegDG3M_PLUS_axgXxqGekE_SLASH_jAxHvLjce4_EQUALS_hawk%20wp-en0622.pdf Ann thread: https://God.blue/splash.php?url=13R10AsNax4GavoMC6eAakP78t96u92wjElwJYOdfk0XxVuYoIUt3M0osuHk_PLUS_uPh3zUndvVt95Hz8w87bEQYrbXS9f1opLPT1xgrBAZnIsqwBIb5yYoJr7NmXbPuon1EY426_SLASH_cG8uO1kWCPYQw1SaNQ6srEsEYHuQbEBaFcAqqA_EQUALS_ Twitter: https://God.blue/splash.php?url=7asOJDSVdsnW4uJGkg_PLUS_Xh45kXxAjA_PLUS_xEO_SLASH_5pt1wF8V8aluleEAXH6DgtYVNJZ62Xh_SLASH_J8UsjjCoyscWehKNcE_PLUS_HZv3_PLUS_oN895UPiyxRSdFLvI_EQUALS_ Facebook: https://God.blue/splash.php?url=DyvQQqeJHNEYAupG90nkM3KvxMk_PLUS__PLUS_JuFskfYdQUwuQYKt1nZND7n0ZZ8LHYZw_SLASH_zd8KLhc7qqnl6zlsd_SLASH_fj3yjfTZFRDcR_SLASH_Lnd2Carf4SPidnTAqWwAr8_SLASH_YM17tworqEo Telegram: https://God.blue/splash.php?url=zMYKgbflDTXLD6f8_SLASH_0mSCRNFdC6gYV4l9YSke4WV2_SLASH_CW17KEB7LT5vlaazM_PLUS_gaMd5gsqDN9aRT0BsXjcrvEdhkjLUt4S2myhSDoR1XOZNfU_EQUALS_ Authored By: thankyoulord Bitcointalk link: https://God.blue/splash.php?url=OPiedn8op2p0JxIacJUhzG0BgMMXce7DnDizD31Fx34f9yonprSQGxZBTvpK_PLUS_EVkcvUtXXu_SLASH_oTO0GlAqc76Xey2BBcXbp7ABek8Aq_PLUS_9RsJ71grBQn_SLASH_JNKNvZBM_SLASH_FW47_SLASH_0zmflfAv_PLUS_JTmPahaxWjCqA_EQUALS__EQUALS_ Wallet address: 0x2859DD117e5E186B2dec97b50012C66D0E7597Ad

          

HAWK NETWORK- DISTRIBUTED INTELLIGENT IoT TECHNOLOGY INFRASTRUCTURE.   


The advent of the internet has been a blessing to many, making it easy to find things, buy and sell online, and communicate and connect our mobile phones to other devices. But is that really enough? Every day our world is faced with new technological advances, and one of them is the Internet of Things (IoT). IoT is a big network that connects various information-sensor devices and systems, based on the M2M communication mode and blockchain technology, to the internet through various access networks. However, even with this milestone achievement, this latest technology has failed to meet certain demands and is not being distributed as widely as it should be. HAWK, a revolutionary platform, intends to create a unique community to effectively enable IoT development and use. The project has prepared calculations and algorithms for how IoT can be properly managed and stored. Its decentralized nature enables security and fast transfer of data. The project is also focused on a reliable platform that brings together several smart devices in the IoT network, based on blockchain technology, big data and edge computing, to eliminate the challenges of piracy, transfer delays, speed, security, etc. Furthermore, the project has developed an intelligent idea in the form of the Black Hawk Knight (BHK) application. This is a decentralized application which aids short-distance travel within cities using electric scooters. How innovative is that? These are to be distributed across 50 countries around the world, and the project's investors will share the dividends.

PARTNERS
Hawk Network has some great and unique partners, which include:
- Grab
- Ground X
- Klaytn
- Alipay
- Trans link
- AWS
- China unicorn

FEATURES OF THE HAWK PLATFORM
1. Smart contracts
2. Edge computing
3. Blockchain technology
4. Big data
5. Trusted network
6. Intelligent terminal

TOKEN INFORMATION
The HAWK token is a utility token based on the Ethereum ERC20 blockchain standard.
It can be used in many scenarios such as payments, transactions, circulation, etc. The project intends to hold an ICO with a total token issuance of 20,000,000,000 HAWK at a placement price of $0.0067 per HAWK. Tokens will be distributed 30 days after the public placement. The hard cap is valued at $30 million.

TOKEN DISTRIBUTION
- 49% - ore pool
- 15% - foundation
- 13% - team
- 10% - public offering wheel
- 10% - ecological stimulation
- 2% - private placement round
- 1% - seed round

ROADMAP
- Oct 2018 - Project research and project initiation
- Dec 2018 - Complete project planning and overall design; complete the relevant underlying technology selection; research and dock the intelligent hardware interface
- Feb 2019 - Complete economic model design; draft the first edition of the white paper; first DAPP application BHK presale officially online
- Apr 2019 - Complete the edge computing technology demonstration; complete the Intelligent IoT Gateway Agreement; complete the technical architecture
- Jun 2019 - Hawk Network starts underlying coding; open private fundraising; start global channel promotion; community partner recruitment; recruitment of city partners
- Oct 2019 - Open IEO; global meetups; first Dapp online; first equipment starts mining; top exchanges start trading HAWK

The HAWK NETWORK is focused on the improvement of IoT-related issues and on enhancing reliability, security, data exchange and transactions, data circulation, etc.
For more details on the project, please follow any of the links below:

WEBSITE: https://God.blue/splash.php?url=2JbxcChpt_SLASH_P8q_SLASH_H6l2h31rFLIkwx4ZYtqiIdwke1xGcMfBpU67dCp1LfkuOkxj68WiJFsfWtH9SSCzjW4VWtDpYVuOGm942usfC3pRYH4Rs_EQUALS_
WHITEPAPER: https://God.blue/splash.php?url=2JbxcChpt_SLASH_P8q_SLASH_H6l2h31rFLIkwx4ZYtqiIdwke1xGcMfBpU67dCp1LfkuOkxj68WiJFsfWtH9SSCzjW4VWtDpYVuOGm942usfC3pRYH4Rs_EQUALS_HAWK%20WP-EN0622.pdf
ANN THREAD: https://God.blue/splash.php?url=DlPmGNB_PLUS_P7Fyn_SLASH_zhzzbhMF9yh8imFksAgt1_PLUS_k4M7UWsgyM85_PLUS_c4H_SLASH_tHeGIWPFLmXROcKClRIZLcS_SLASH_1l79cwwAL_SLASH_hvnIF8RU23dFbvvhloiY52y5y_PLUS_wu_PLUS_dKFgjPFaqLr4HNCUR6RLizCnw_PLUS_WzGxiEfm9tfKW8_PLUS_Z1yAioB86EquwQ_EQUALS_
TWITTER: https://God.blue/splash.php?url=rPcuefhBfZ_PLUS_4bfVa4KrMHsDkKSl_SLASH_ok9mxNTfEnd_PLUS_3Fax99YXH4lHPG9ETOq3A7wkz4MBlVECXdiH5wrrvM04AINYlSh8sfH1qlmrDPngvJY_EQUALS_
FACEBOOK: https://God.blue/splash.php?url=z5lNurK4YCRBkjiVRvu9FJARkofPOXrqRSLZSJWbuV9LMy2Xm1kTnKCNX3LtHUHBMz02EAKafq6CZpF5RODrgJTVgGB8pT7bIMlydIQGzmiQa9RMTCfdrHRkptGJyown
TELEGRAM: https://God.blue/splash.php?url=3UnaSJ8OL9FS3LL3OiyRoRbC73MF2TZDKALNR2UQHiqD1_PLUS_k6Rh_SLASH_37oyvKn8rDoA5VITvFjt92Q8xDixbmAedYBU3p5Dnau4nhVisY2_PLUS_fM7E_EQUALS_
BITCOINTALK USERNAME: ogtejiri
BITCOINTALK PROFILE LINK: https://God.blue/splash.php?url=A2e36418ZNgNDofgFUOThsLx2_SLASH_GQw_SLASH_oCfLXQcW0_PLUS_6lVKpYFV7i7ovjQ21AYU_SLASH_0FU_PLUS_b08mWVXq17yLiyMbSVU2oBaDPoIiuqjMy_SLASH_RfXpuX2HfTKe5y6cY_PLUS_HjpJygOG6rFP4An1uRBSixii_SLASH_lKnpmDoQ_EQUALS__EQUALS_
ETHEREUM ADDRESS: 0x33E8810b5432ccD823b6c45975A55Fb9F6c931D6

          

Glocal: the digital journalism festival returns to Varese

Four days dedicated to digital journalism that looks at the world from a local perspective. Events, meetings, debates, shows, experiences and workshops: all of this is Glocal, the digital journalism festival taking place in Varese from 7 to 10 November.

The eighth edition of the event features 45 sessions and 125 speakers across ten different venues. All of them will try to outline the future of the journalistic profession from a particular angle: how storytelling can change a territory. Between tourism and its promotion, leading figures of the journalism scene and first-hand accounts, music, sport and workshops, Glocal looks at the journalism of the future through the lens of new technologies.

"These are constantly changing scenarios that we find ourselves facing," observes Marco Giovannelli, creator of Glocal. "Amid ever-growing complexity, today's journalist is called to keep their feet firmly planted in their roots, in the ethics that set the course of the profession, but also to keep an eye on a technology that never ceases to amaze. The new media must not be the end, but the tools for telling the story of reality."

There are at least four threads running through the 2019 festival, intersecting repeatedly in a tension constantly suspended between the local and the global. The first is the series of sessions organised with Google to understand scenarios and changes, culminating in the meeting (scheduled for 9 November) with Google's Vice President of News, Richard Gingras, who will be in conversation with Mario Calabresi, former editor of La Stampa and la Repubblica.

Among the "author voices" are journalists such as Chiara Nielsen, deputy editor of the weekly Internazionale, who has directed the Internazionale festival in Ferrara for over ten years; Giuseppe Cruciani, who together with Valerio Staffelli will open the festival in a session organised by the Order of Journalists; and Sandro Ruotolo, who together with change.org director Stephanie Brancaforte will discuss the virtuous circle between journalism and activism.

The "storytelling of places" is the largest part of the festival. The new languages of travel journalism are at the centre of a series of sessions dedicated to the sector. Journalists, writers, bloggers and media experts will discuss the new frontiers of narrating journeys, territories, destinations and trails. Digital innovation, social networks, big data and experiential tourism are just some of the topics on the agenda, in sessions organised with the aim of starting a reflection on the changes that have affected the world of tourism communication.

And certainly not least, the music: rappers Tormento and Kaso will be guests of the festival to recall the 1990s, when Varese was a leading player in the Italian hip-hop scene.

As every year, the evenings of the festival's three days will offer opportunities to meet, food for thought and convivial moments. On Thursday 7 November, "Twenty years of Varesefocus" will be celebrated: in the Napoleonic hall of the Ville Ponti conference centre, the milestones that gave rise to the magazine of the Industrial Union of the Province of Varese will be retraced. The protagonists of some of the stories the magazine has told, and the journalists who made it an important point of reference for the area, will take the stage to explain how a publication can represent a territory and increase its value.

Friday 8 November brings an evening dedicated to choral music and the experience behind Dovesicanta.it, a digital platform that connects enthusiasts and professionals in the field.

Saturday 9 November is an appointment with top-flight basketball. The sport that has always identified the city of Varese will take centre stage with some of its most illustrious exponents. At Ville Ponti the documentary "Parigi 1999, vent'anni dopo" ("Paris 1999, Twenty Years Later") will be screened: a collection of testimonies from those who were there and those who lived through the Italian team's run in the summer of 1999, when Italy became European champions.

Admission to the evening events is free.

Workshops are confirmed: Google Search, photography, podcasts, tools and press offices, data journalism and much more. A series of lessons, open to all and free of charge, to learn the new tools of communication.

For the second year in a row, the two prizes dedicated to multimedia journalism and data journalism also return, along with BlogLab, the journalism workshop for secondary school students: micro-newsrooms will compete by telling stories with all the tools the new journalism makes available, from writing to photos and video.


          

IRI: Media Measurement Specialist - CPG

Salary Commensurate with Experience: IRI: Big Data is in our DNA. And our Media Team is growing fast. IRI continues to innovate in the space and has developed and integrated the world’s lar... Chicago, Illinois

          

TOMORROW - (FRESHERS) 'TECHAFFINITY GLOBAL' : Walk-In : Business Development Executives : On 5-8 November 2019 @ Chennai

(Freshers) 'TechAffinity Global': Walk-In: Business Development Executives, 5-8 November 2019 @ Chennai

TechAffinity Global Private Limited

TechAffinity is a technology-driven custom software solutions company, founded in 2000 in Tampa Bay, Florida. We have a strong team of more than 200 skilled and experienced IT experts working on various technologies including MS, open source, mobile, big data, etc. Our corporate headquarters is in Florida, US, with offices in New Jersey, and we also have a modern APAC development centre in Chennai, India.

(Freshers) Walk-In: International Business Development Executives @ Chennai

Greetings from TechAffinity Global! We are conducting a walk-in drive for International Business Development Executives (freshers). Male candidates only, because of the night shift. "Your degree is not your measuring yard for growth. Communication is." Any candidate interested in sales/business development with excellent communication skills can apply. Aspirants with experience in international sales, cold calling, voice or outbound processes can also apply.

Job Position: International Business Development Executive
Job Designation: Fresher
Job Type: Full Time, Permanent
Job Category: Business Development
Walk-In Location: Chennai, Tamil Nadu
Job Location: Chennai, Tamil Nadu
Number of Vacancies: 15
Joining Time: Immediate
Qualification - Eligibility Criteria: Any graduation
Desired Experience: 0 to 6 months

Job Responsibilities:
- Market research and lead generation
- Make outbound cold calls
- Talk to influencers and decision makers involved in IT/software development services
- Identify potential opportunities
- Fix appointments for corporate presentations
- Follow up regularly with the prospects that have been identified

Note:
- Male candidates preferred.
- Candidates should arrange their own transportation.
- Must be willing to work US timings (4 PM - 1 AM).

Please carry the documents below (mandatory):
# Updated resume copy
# Photo ID proof (Passport/PAN card/Aadhaar card/Voter ID/Driving licence/College ID)

Note: Candidates who do not meet the given eligibility criteria, please do NOT apply.

Walk-In Date: 5th to 8th November 2019
Walk-In Time: 4:00 PM to 8:00 PM only
Walk-In Venue: TechAffinity Global Private Limited, Global Infocity Park Chennai, 8th Floor, Module 4B-Block A, No: 40, MGR Salai, Kandanchavadi, Chennai - 600096. Landmark: Next to RMZ Business Park, opposite Gem Hospital
Contact Person: HR
Contact Number: +91-44-66806565

          

Big Data Developer (Scala/Java)

Position: Big Data Developer (Scala/Java)
Company: Nordea Bank Abp SA, Poland Branch
Location: Gdynia

          

The Big Data Foundation has organised the II Technology Conference in the Senate, with top-level speakers on blockchain and cybersecurity strategies.

Madrid, 24 October 2019. The Big Data Foundation today held its II Technology Conference in the Senate on emerging technologies, cybersecurity and blockchain. The conference addressed blockchain strategy in both Spain and the European Community and its relationship with cybersecurity, with the aim of implementing coherent and structured actions for the prevention, defence, detection of and response to cyber threats.

Read more...


          

Healthcare Cloud Computing Market - Global Industry Dynamics

(EMAILWIRE.COM, November 06, 2019 ) The key factors driving the growth of this market include increasing adoption of big data, wearable devices, and IoT in healthcare; advantages of cloud usage (such as improved storage, flexibility, and scalability of data); implementation of healthcare reforms...

          

Offer - Financial App Development Company App Maisters - Houston

Are you looking for experienced mobile app developers to build you a financial or banking app? If so, I'd recommend you check out App Maisters for mobile financial application development. Why choose App Maisters? App Maisters is a trusted mobile app development company, relied on by many IT professionals. Based in Houston, Texas, App Maisters Inc. is recognised as one of the top digital solutions providers in the United States. Bringing digital transformation and solutions to startups and enterprises, App Maisters offers a wide array of expertise and services to ensure clients achieve innovative and intelligent mobile applications, artificial intelligence, blockchain, IoT, business intelligence, big data and other enterprise applications and integration.

          

Offer - App Development Company In Houston - Houston

Are you looking for the best app development company in Texas? I'd recommend you choose App Maisters for your next project. They have the best mobile app developers in Texas. Based in Houston, Texas, App Maisters Inc. is recognised as one of the top digital solutions providers in the United States. Bringing digital transformation and solutions to startups and enterprises, App Maisters offers a wide array of expertise and services to ensure clients achieve innovative and intelligent mobile applications, artificial intelligence, blockchain, IoT, business intelligence, big data and other enterprise applications and integration. Visit here: https://God.blue/splash.php?url=_SLASH_z6ahBx0iXz_SLASH_L9mB7cBhj7XeYhiHc84jmxfciufLbU9xzN8yl_PLUS_5F1tSjj2UbEyjFG5O3mlAmtBBq6Pcw8kVMKLsjm_PLUS_r8itTF8_SLASH_8yu_SLASH_y0yfTv9dAlSfjzGc2XDk3gvc6_PLUS_dXSf1RBPkF08oz0dHNU1Eg_EQUALS__EQUALS_

          

Xavier Ros-Oton, Princess of Girona Award winner: "I informed the Royal Household a few hours beforehand that I would wear the yellow ribbon"

Xavier Ros-Oton is known in universities around the world as a leading mathematician in the study of partial differential equations. In Spain, however, this award-winning 31-year-old researcher from Barcelona has become famous for wearing a yellow ribbon on his lapel; specifically, to receive the 2019 Princess of Girona Award last Monday in Barcelona. He then took it off for the dinner, as he would explain the next day, but that did not stop the hoax circulating for hours that the Royal Household had erased the ribbon with Photoshop.

In conversation with eldiario.es before returning to Switzerland (he has worked at the Institute of Mathematics of the University of Zurich for two years), Ros-Oton explains why he believes equations move the world. If he had to convince a student who sees no application for equations, he would say that nothing they use today would be understood without mathematics.

At his young age he has already accumulated prizes such as the Rubio de Francia (from the Spanish Mathematical Society) and an ERC Starting Grant from the European Research Council for young talent, as well as numerous publications in prestigious journals. All this while working abroad for years (before Zurich he was at the University of Texas in the US), as he is very critical of research policy in Spain, which in his opinion is insufficient.

Why did you wear the yellow ribbon when you went on stage to receive the award?

Basically because I wanted to convey a message that I believe is shared by a large majority of Catalan society and part of Spanish society: that in a democracy things are decided by voting, and that it does not seem fair to me that there are Catalan politicians in prison, sentenced to almost 100 years in jail, for organising peaceful demonstrations or allowing debates in the Parliament.

You explained afterwards that it was not an act against the monarch and that you do not even associate the ribbon with the independence movement.

For me it goes beyond independence, which is why I say the message is shared by a significant part of Spanish society. Within Podemos there are many people who share that same idea even if they do not wear the ribbon, and the same goes for many friends of mine in Madrid and at other universities. It is a political issue, not a judicial one. And it seems normal to me that in a democracy you can say this without going against anyone. It is not a symbol against anyone.

Did you say anything about this to the king or anyone from the Royal Household?

A few hours beforehand I informed them briefly, both the foundation and the Royal Household's protocol office, so that they would know. They told me they did not think it was a good idea, but everyone was polite.

Nobody forced you to take it off, nor was it Photoshopped out of the photos of the evening, as many people claimed on social media.

I only wore it for the award ceremony, because that was the televised part.

After the impact your gesture had, have you felt used by politicians?

No.

Some media outlets have gone so far as to accuse you of paying taxes in Switzerland despite being Spanish.

That obviously makes no sense. If I work in Switzerland, I pay taxes in Switzerland. It is not as if I can choose. Since then I have received a lot of support from other mathematicians here and all over Spain. There is no need to give it more attention; I try to ignore it.

What do you think of the fact that there has been more talk about the yellow ribbon than about your achievements as a mathematician, which are the reason for the award?

Well, it should not be that way. The yellow ribbon was a reminder; I wore it because I had a moment of visibility and because it is a majority sentiment, but the normal thing would be for the news to be the award and the winners. Some newspapers are only interested in talking about this.

Let's talk about mathematics. You study partial differential equations, which can be used for prediction in everything from finance to meteorology. Could you explain them to someone who does not know much about the subject?

Partial differential equations are used in every branch of science, starting with physics. Any physical theory has one of these equations behind it, from how heat propagates, to waves, electrostatics, quantum mechanics... They are also used in many other sciences, such as biology, financial economics and engineering. From the design of LED bulbs to MRI scans, from weather prediction to the aerodynamic design of an aircraft, all of these use these equations. And specifically, this is mathematics produced by research over the last 100 years; it is quite recent. What my team and I do is study these equations from a theoretical point of view. We develop the theory so that it can be used in the different branches of science.

A concept that appears in some of these equations, and which you have also investigated, is that of "free boundaries". What are they?

Within these equations there is a class in which free boundaries arise. If you want to see how heat propagates, there is a partial differential equation for heat; but if there is a phase transition between water and ice, for example, and you want to see how the ice melts, it is no longer just the heat equation. Since there is an interface between liquid and solid, a more geometric problem is added, which makes it more complicated from a mathematical point of view.
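As an illustration of the kind of free-boundary problem described here, the classical one-phase Stefan problem for melting ice can be written as follows (this is the standard textbook formulation, not taken from the interview):

```latex
% One-phase Stefan problem in one dimension:
% u(x,t) is the water temperature, x = s(t) the ice-water interface.
\begin{align*}
  u_t &= u_{xx} \quad \text{for } 0 < x < s(t) && \text{(heat equation in the water)} \\
  u(s(t),t) &= 0 && \text{(melting temperature at the interface)} \\
  \dot{s}(t) &= -u_x(s(t),t) && \text{(Stefan condition: interface speed set by the heat flux)}
\end{align*}
```

The free boundary s(t) is itself an unknown of the problem, which is exactly the added geometric difficulty Ros-Oton describes.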

In a talk you argued that equations move the world. How would you convince a student who is starting to study them in primary or secondary school and sees no practical application?

It is difficult. I would tell them that even if they cannot see it, nothing they use today would be possible without mathematics that was often done in recent decades or centuries. And that to become aware of that, they have to learn more mathematics. It is difficult because first you have to learn in order to then see how it can be used. That is why it is hard to see the appeal or usefulness of mathematics. But if I could choose the secondary school mathematics curriculum, I would perhaps ask for it to be connected more with other subjects. That would not be so complicated.

What attracted you to mathematics? Why do you do it?

Many factors. But an important one was taking part in mathematics competitions such as the Spanish Mathematical Olympiad, at 16 or 17. There I saw that mathematics goes much further, that real mathematics is more about reasoning and less about calculating. That is the most beautiful part and what stimulates me the most. And it is what made me enrol in the mathematics degree [at the Universitat Politècnica de Catalunya (UPC)].

Professional demand for mathematicians has soared in recent years as a result of the big data boom and the algorithms behind apps and websites. What do you make of this phenomenon?

First of all, it seems true to me. Many colleagues who studied with me work in very different places, from some who started their own big data consulting companies to others who are at Google, Instagram, insurance companies, banks and so on. It is a natural boom, because we are at a moment when many processes that used to be done by hand can now be automated, and artificial intelligence requires many mathematicians and a lot of mathematics. And this will only grow. In Catalonia, in just a few years the entry mark for mathematics degrees has become one of the highest.

The concept of the algorithm, used by big social networks and apps such as Facebook or Instagram, is associated with dark interests capable of influencing the lives of thousands of people. Do you share that view?

The people I know who work at these big companies, and who know how it all works, tell me they are fairly relaxed in that respect, about how data is managed. And that reassures me. But then we have examples like Facebook in the American elections. That is because it is a new phenomenon that is not regulated, and it has to be done well. It is like everything: a tool with immense power that can be used for good or bad things.

For your current research in Zurich you receive 200,000 Swiss francs from the government. What do you spend it on?

On building a research group: hiring doctoral students and postdocs, organising conferences, travelling to congresses with my whole group, inviting professors to seminars and conferences... When I was in the US, the American government gave me 100,000 dollars. And recently I won a European Union project, the ERC, worth more than a million euros.

Would amounts like that be possible in Spain?

From the European Union yes, anywhere, but from the Spanish government clearly not. In Spain this would be impossible.

Why?

To begin with, because Switzerland is a richer country, but it is not only the lack of money; science in Spain still needs a lot, starting with greater political will. And that begins with the fact that there are almost no scientists in the Congress of Deputies. We lack talent-attraction policies, more public and private investment, greater internationalisation, promotion of women's interest in science... There are very many things to improve. The Spanish scientific system is still far from that of the most developed countries, even though we have many good people and many good centres.

Did you go abroad to do research for this reason?

Not only for that. Also because it is good for young researchers to go abroad, to learn at the best possible universities in our field. The fact that I am abroad is not a bad thing. What Spanish universities have to try to do is attract talent, and it does not have to be the Spaniards who left; they should hire the best, wherever they are from. If that is achieved, Spanish researchers will come back.

Are you considering coming back?

I am very happy in Switzerland, but I also feel like returning to Barcelona.


          

DataToBiz - ETL Engineer (2-6 yrs) Chandigarh (Backend Developer)

We are a team of young and dynamic professionals looking for an exceptional Data Engineer to join our team in Chandigarh. We are trying to solve some very exciting business challenges by applying cutting-edge Big Data, Machine Learning and Deep Learning Technologies. Being a consulting and services startup we are looking for quick learners who can work in a cross-functional team of Consultants,...

          

Datafied Knowledge Production

Thylstrup, N., Flyverbom, M. & Helles, R. (eds.), "Datafied Knowledge Production", Big Data & Society, 27 Sep 2019, pp. 1-5.

Research output: Contribution to journal > Journal article > Research

Original language: English
Article number: 1
Journal: Big Data & Society
Pages (from-to): 1-5
Number of pages: 5
ISSN: 2053-9517
Publication status: Published - 27 Sep 2019

          

Government still distrusted on Big Data

Big Data LDN report shows new Government needs to make ‘giant leap’ to regain confidence of UK data leaders Thursday 7th November - London, England - Europe’s largest event for data, Big Data L...
       

          

UK's brightest data leaders gather at Big Data LDN 2019

Thursday 31st October - London, England - Europe’s largest event for data, Big Data LDN (London), returns to Olympia London on November 13th and 14th for a fourth year. The event has surpassed all e...
       

          

BIZIT 2019: big data, AI and machine learning

The second block of the second day of BIZIT was dedicated to highly topical themes in the global IT world: big data, artificial intelligence and

          

Internet trends for 2020

It is no surprise to be told that we now live in a hyper-connected world. Whether in Guinea, the United States, France or elsewhere, the internet now occupies a dominant place in our lives. What should we expect in 2020? Which new trends will emerge, or grow even stronger, on the internet? Let's look at that in this article!

The internet in the service of robots and autonomous devices. With internet transfer speeds becoming ever faster, the possibilities for development and innovation are enormous. Among those attracting the most attention, and resting largely on the development of the internet and improved data collection (big data), are connected and autonomous cars. Without the internet, they would never have seen the light of day. In 2020 we can expect a notable improvement in performance, notably in the driverless cars tested by Google, or the flying cars that will appear...

The article "Les tendances d'internet pour 2020" first appeared on Guinée Matin - Les Nouvelles de la Guinée profonde.


          

Middle Clojure/Python Developer at Kasta.ua, Kyiv

Required skills

We are looking for a developer who, besides Python, is interested in functional programming languages and would like to learn Clojure. Your knowledge of computer science and your experience developing web services will be assessed on the basis of your experience with Python.

Technologies:
- 4+ years of Python development experience;
- Experience with RESTful architecture;
- Experience with high-load systems and asynchronous frameworks;
- Deep knowledge of relational databases and experience with PostgreSQL;
- Experience writing unit and end-to-end tests.

Nice to have

- Knowledge of functional programming languages, especially Clojure;
- Experience with Kafka, ElasticSearch, ClickHouse;
- Experience with Django or Flask;
- Experience with web technologies: SCSS, React, Redux, Webpack;
- Experience with external APIs such as Google and Facebook.

We offer

- A competitive salary;
- Relevant tasks (not only your friends but also your relatives will understand what you do);
- Interesting tasks: Kasta (modnaKasta) is one of the largest and fastest-growing e-commerce companies in Ukraine;
- Fast feedback: people shop 24 hours a day;
- Lots of data (we are not big data, but we are not your neighbour's blog either);
- Support in difficult situations; we have put together a great team!

Responsibilities

- Write code;
- Ship all the stuff;
- Communicate with the product manager, the team and the holding company's CTO.

About the project

Kasta is one of the largest online stores in Ukraine. We have an interesting (and effective) architecture, and we plan to improve it further. There is a little Python/Django, and new development is done in Clojure. Our frontend is ClojureScript + Rum (with React inside), and on the backend we use PostgreSQL, ElasticSearch, Kafka and various other things. We manage projects in GitLab, code-review everything that goes to production, and our goal is to build a store that we ourselves enjoy using.


          

Microsoft SQL Server 2019 offers data virtualisation

At the Ignite conference in Orlando, Microsoft presented SQL Server 2019. Microsoft positions SQL Server 2019 as a unified data platform, on which enterprise data can be stored in a data lake and queried with SQL and Spark.

This version extends the capabilities of previous releases, such as the ability to run on Linux and in containers and the PolyBase technology for connecting to big data storage systems. SQL Server 2019 uses PolyBase v2 for full data virtualisation and combines the Linux/container compatibility with Kubernetes to support the new Big Data Clusters technology.

Big Data Clusters implements a Kubernetes-based multi-cluster deployment of SQL Server and combines it with Apache Spark, YARN and the Hadoop Distributed File System to deliver a single platform supporting OLTP, data lakes and machine learning. It can be deployed on any Kubernetes cluster, on-premises and in the cloud, including on Microsoft's own Azure Kubernetes Service.

With SQL Server 2019, Microsoft also wants to simplify the ETL process through data virtualisation. Applications and developers can use the T-SQL language to access both structured and unstructured data from sources such as Oracle, MongoDB, Azure SQL, Teradata and HDFS.

Azure Data Studio

Microsoft also offers the GUI tool Azure Data Studio, a cross-platform database tool for data professionals. Azure Data Studio, previously in preview as SQL Operations Studio, offers a modern editor experience with IntelliSense, code snippets, source control integration and an integrated terminal. With Azure Data Studio, Big Data Clusters can be accessed through interactive dashboards, and it also offers SQL and Jupyter Notebook access.

Read all the details in the extensive blog post by Asad Khan, Partner Director of Program Management, SQL Server and Azure SQL.

More information: Microsoft


© Googlier LLC, 2020