Posted by raherschbach on 13 Mar 2018

By Jason M. Pittman, Sc. D.
March 13, 2018

The popularity of artificial intelligence has grown to epic proportions. In recent years, we have witnessed the rise of advanced consumer-focused AI through products such as Alexa, Siri, and Tesla Autopilot. Businesses, too, have embraced AI. Machine learning is exploding as an application of narrow AI, used to comb through vast amounts of data to better develop and market products. Further still, historically complex games such as Go have been conquered by learning computers.

While these advances are momentous points in human development, the public has been told a lie. We have been duped into viewing these systems as thinking machines.

A thinking machine, a machine with intelligence, cannot be artificial. Artificial intelligence is a simulated intelligence: a cantrip cast upon the public to hide the complex programming behind our virtual assistants, our auto-piloting vehicles, and our game NPC behavior. Certainly, none of us considers these constructs truly capable of thought. Thus, a simulation of thinking is not, strictly speaking, thinking.

However, a genuine non-human intelligence may not be out of reach if we begin to seriously consider synthetic intelligence. Moreover, the race to create human-like intelligence is folly. Creating intelligence, after all, is an almost unspeakably difficult problem. Thus, starting with the most complex form may be ill-advised if our aim is to develop an understanding of intelligence models and make honest progress on the problem.

The use of synthetic in the phrase synthetic intelligence does not, as some take it, mean fake. Rather, synthetic implies a synthesis of foundational elements into something that is very much like something else. In fact, the synthesized object is often superior to the natural form. Here I purposefully use natural rather than original, despite the latter being the term that would be most natural for us to use. Such synthesis typically manifests through some external mechanism that is causally directed. That is, there is an intent to produce the synthesized object through the synthesis process. Moreover, synthesis results in an object that is just as, if not more, real than the original.

Diamonds are an easy example. Synthesized diamonds are still compressed carbon, albeit formed in a laboratory rather than in the earth. Such objects are no less real than natural diamonds pulled from the ground. Further, synthetic diamonds are flawless in their diamondness, something natural diamonds rarely achieve.

We are obsessed with imitating human form and intelligence. This is anthropocentrism and egocentrism of the highest order. Amongst animal life, we are but one species within a group of roughly half a dozen that exhibit normative signs of intelligence. Yet, animal life is not the pinnacle of intelligence. For example, there is increasing evidence that plants may possess a form of consciousness. Fungi represent a form of life that is complex and may harbor intelligence. These are just the carbon-based lifeforms that exist in our immediate reality. Certainly, there could be other lifeforms, or non-carbon-based life, yet undiscovered.

So, why are we content to dabble in mere imitations of intelligence? Why aren’t we exploring the means to synthesize intelligence with the goal of producing not a sham mirror of us but a unique synthesized mind?

Let this serve as a call to action then. The explosion of interest in, and development of, artificial intelligence is ultimately a dead end. The work will be useful in exposing some, maybe most, of the underlying mechanisms related to consciousness. However, artificial intelligence, by definition, will not produce a mind we can denote as intelligent. Rather, the best we will get is an imitated intelligence, a fake.

Comparatively, if we capture this momentum and refocus slightly, revising artificial to synthetic, there is a higher probability of success. Then the question becomes: what is intelligence, and how can we synthesize it? Read the next post in two weeks to find out what I believe to be an answer!

Dr. Jason M. Pittman's career has included many roles: network engineer, security architect, software developer, and professor and mentor. He has worked at companies ranging from tech startups to large corporations, in addition to stints at film studios and a tattoo parlor. He is currently professor of cybersecurity at Capitol Technology University, and has also taught at Cal Poly Pomona.

"I am fascinated by all things human and tech," Dr. Pittman writes. "I see the stars as our inevitable destination and work to do my part in helping our species get there." Learn more about Dr. Pittman at


Posted by raherschbach on 12 Mar 2018

Capitol Technology University may be a STEM-focused school, but that doesn’t mean artistic talent is in short supply.

Many students enrolled in the university’s engineering, technology, and business programs also have an active interest in music, the visual arts, poetry, and other creative genres.

A new club at Capitol, Creative Juices, aims to provide a venue to showcase these students’ activities. On April 12, the club will be hosting an art exhibition at Puente Library that will be followed by a dinner in the Student Center and a rock band performance in Gudelsky Auditorium.

“The event is an art exhibition, and we’re doing it so we can better represent the artists here at Capitol. There are a lot of artists at the university – musicians, poets, photographers, digital and graphic artists – and we want to show them that they have friends here on campus,” said Barron Botts, the club’s founder and president.

Barron, a sophomore in the computer science program, established Creative Juices last year after seeing that no other art clubs existed on campus.

Artistic endeavors can provide an essential counterbalance to homework, lectures, and labs. “When I was in high school, poetry was one of the few things that kept my interest going while I was busy scarfing down information for classes,” he said. “When I got to Capitol, I didn’t want to be in a situation where I just went to class every day and came back home to do my homework, with nothing to go out and do.”

This year, in addition to leading the Creative Juices club, Barron is also enrolled in an engineering poetry class taught by adjunct professor John Washington, an accomplished poet who is also Capitol’s assistant director of advising and student success.

“It makes us realize that there’s another perspective to engineering and technology,” Barron said.

Interested in finding out more about the April 12 art exhibit or the Creative Juices club? E-mail Barron at

Photo (from left): Jalyn DeJesus, Jacob Rush, Barron Botts, Joshua Ferguson, and Jorge Rodriguez. Creative Juices poster designed by Johnathan Botts.


Posted by raherschbach on 9 Mar 2018

Prior to beginning his doctoral degree at Capitol, consultant and IT professional Chuck Easttom had already made significant contributions to the fields of cybersecurity and computer science. He is the author of 26 books on programming, digital forensics, cybersecurity, and penetration testing. Several of those books are used as textbooks at various universities. He holds more than 40 industry certifications and served as a subject matter expert for CompTIA in the creation of the Security+, Server+, and Linux+ certification exams. He was also on the Certified Ethical Hacker version 8 test revision team and created the OSForensics Certified Examiner course and test.

Easttom is a regular speaker at computer science and security conferences, including Defcon, SecureWorld, ISC2 Security Congress, IEEE conferences, AAFS, and many others. He has published dozens of peer-reviewed papers and articles in trade journals such as 2600 Hacker. Additionally, Easttom is an inventor with 13 computer science patents to date.

Most recently, Easttom was invited to present a paper on weaponized malware at the 13th International Conference on Cyber Warfare and Security, held from March 8 to 9 at National Defense University. In addition to the paper, Easttom presented a poster at the event.

What research did you present at the ICCWS?

The paper is, in effect, a how-to on weaponized malware, and puts forward the argument that we should use weaponized malware. Cyber warfare is here, it occurs, malware is the weapon of choice in this domain, so let’s look at how to use it effectively.

The paper also aims to set up a different type of malware taxonomy. Instead of looking at malware based on the damage it causes, we look at it based on which one would be best selected for particular cyber warfare missions.

In addition to the paper, I presented a poster on a proposed taxonomy based on the McCumber Cube, which is one of the important conceptual models used in the cybersecurity field. The McCumber Cube provides a view that goes beyond the oft-cited triad of confidentiality, integrity, and availability; it allows us, for instance, to apply these three parameters to data at rest, data in motion, and data in processing.  So we get multiple dimensions. What I’m proposing is a taxonomy for all types of attacks – malware, denial of service, or any other type of attack – based on which of the McCumber Cube dimensions they affect. I have a paper in the works on this topic.
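To make the idea concrete, the first two dimensions of the McCumber Cube can be sketched as a small classification structure. This is a toy illustration only: the attack labels and their cell assignments below are my own hypothetical examples, not Easttom's actual proposed taxonomy.

```python
from itertools import product

# Two dimensions of the McCumber Cube, as described above.
GOALS = ("confidentiality", "integrity", "availability")
STATES = ("at rest", "in motion", "in processing")

# Hypothetical mapping of attack types to the (goal, data state)
# cells they affect. Illustrative assumptions, not a real taxonomy.
ATTACKS = {
    "ransomware": {("availability", "at rest"), ("integrity", "at rest")},
    "packet sniffing": {("confidentiality", "in motion")},
    "denial of service": {("availability", "in processing"),
                          ("availability", "in motion")},
}

def classify(attack: str) -> list:
    """Return the cube cells an attack affects, in a stable order."""
    cells = ATTACKS.get(attack, set())
    return [cell for cell in product(GOALS, STATES) if cell in cells]

print(classify("packet sniffing"))  # [('confidentiality', 'in motion')]
```

Classifying by affected cells, rather than by damage caused, is what distinguishes this style of taxonomy from the damage-based ones the paper critiques.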

What are some of the objections raised against use of weaponized malware, and how would you answer these objections?

The first is the general ethical issue of using cyber, in any way, as part of an offensive methodology. However, it is simply a fact that countries have cyber conflicts. That’s the reality. From my perspective, weaponizing malware isn’t different from developing any other type of weapon. Scientists work on developing missiles, guns, and other things. Why would a cyber weapon be any different? What I find odd in these ethical discussions is that the same people who voice outrage at the fact that the United States or one of our allies might attack computers in Iran don’t seem as outraged when we send in a plane and drop bombs. Now, if you’re angry at me, would it be better from my perspective for you to drop a bomb on my house or to target me with a computer virus? Maybe others will disagree, but I vote for the virus!

That brings us to the second objection. Carl Sagan famously opined that no scientist should be involved in any sort of weapons research. While Sagan is a great hero of mine, I can’t agree with that. We live in a world where bad things happen and there are bad people. That means weapons are required, including cyber weapons.

One of the things I do discuss in my paper, though, is how to minimize collateral damage. I’ve already published research on how to target malware so that it looks at the machine it is on and determines whether it has found one of its targets; if not, it would self-destruct. That’s something we’re not doing that I think we should.

The Stuxnet virus offers a case in point. Experts agree that Stuxnet was designed to target Iranian nuclear refinement. In the process of reaching its target, though, it affected a whole lot of machines that had nothing to do with Iran or its nuclear program. And that’s a problem. Even if we agree that it’s okay to attack Machine X, it’s not okay to attack every machine that might connect to X.

You’re already a cybersecurity expert who has authored many books and publications. What motivated you to undertake a doctoral degree, and why did you choose Capitol?

We all have gaps in our knowledge. No matter how much expertise you may have, there are going to be areas where you can afford to strengthen your understanding. It’s not uncommon to encounter people – a colleague, say, or even a professor – who know less than you in terms of the overall field, but may have one particular piece that you don’t have.  We have to be ready to put our egos to one side and be willing to close those gaps.

Another reason is more personal. As a child, being something of a geek, I always imagined I would have a doctorate by the time I was 25. Life got in the way and I’m well past 25. My wife told me I would never be happy until I achieve that milestone, and she’s probably right. Not having a doctorate hasn’t hurt my career; I’m a frequent public speaker, often at events where I’m the only speaker without a doctoral degree, and have published several books. But it’s a matter of self-fulfillment.

Capitol jumped out for a couple of reasons. Online education has exploded in recent years, but quite a few of the schools involved – especially the for-profit schools – have what I would consider to be very weak programs. In some cases, they exist mainly for one purpose: to take your money. Capitol is not an online for-profit school; it’s a bona fide university. The undergraduate engineering programs are ABET-accredited; the school has contacts with NASA, and it’s a DHS- and NSA-designated Center for Excellence in cybersecurity. It’s a strong university that happens to offer the opportunity to take courses online.

I also like the fact that Capitol is focused. There aren’t 500 different majors you can take. If you want to major in medieval European history, Capitol isn’t the school for you. Capitol does business, engineering, and technology. I like being at an institution that has this kind of focus.



Posted by raherschbach on 1 Mar 2018

For soldiers transitioning into civilian life, charting the next phase of their careers can be a daunting challenge. A recently launched Maryland initiative, called Military Corps Career Connect (C3), is designed to make the path forward easier to navigate.

The program provides a set of career services that enable transitioning active duty service members, active duty spouses, and recently separated (non-retiree) veterans to move into a number of different professions.

As part of the C3 program, Capitol Technology University is proud to be partnering with the Prince George's County Economic Development Corporation in offering a 16-week Amazon Web Services Cloud Support training course.


Enroll in this course and you will gain the skills you need to join one of the globe’s leading change-makers, using groundbreaking technologies to help organizations and individuals migrate their computer applications into the cloud. The program is free to accepted students, with costs covered by the C3 grant.

C3 will also cover the cost of your Security+, Linux, and AWS certification exams if you are accepted into the program. On successfully completing the course, you will interview to be hired as an Amazon Web Services Cloud Support Associate at a location in Herndon, Va.

Who is eligible? This opportunity is open to a) transitioning service members separating in the next 12 months; b) veterans who have separated from military service in the past 48 months (excluding retirees); and c) spouses of active duty service members who have relocated in the past 48 months. All participants must hold at least a Top Secret Clearance.

Prerequisites: You must provide C3 eligibility documentation, submit a current resume, complete two or more online assessments to determine computer knowledge and skills, and hold a Top Secret Clearance.

Course Content and Schedule: Course time, in this 16-week program, will be divided between instructor-led sessions, hands-on lab work, and work with AWS engineers on case studies. Classes will be held Tuesday through Friday from 5:30 pm to 9:30 pm at Capitol Technology University in Laurel, MD. Start date is March 20, 2018 (subject to change).

C3 covers: training costs; Security+, Linux, and AWS certification exam costs.

Want to learn more about this exciting opportunity to be at the forefront of technological innovation while building your new civilian career? Contact Denise Horsey, C3 Veteran Navigator, at or call 301-618-8407.


Posted by Anonymous (not verified) on 28 Feb 2018

With cyberattacks increasing in volume and sophistication, interest in using cyber analytics tools to predict future breaches and attacks is on the rise.

That includes analyzing the clues left by prior attacks – the network event trail – for patterns that can help in identifying potential attempts at a breach.

“Every breach creates anomalies in the network, like a thief leaving DNA evidence at the scene of a crime,” says Dr. William Butler, chair of Capitol Technology University’s cybersecurity program. “A skilled analyst can use this information to identify patterns of attack. Algorithms can then be developed that look for these patterns and red flag them to cybersecurity teams.”

Being able to accurately flag anomalies is important, in part, because of the sheer volume of network data coming in. Cybersecurity professionals have access to petabytes worth of information – log files, packet inspection systems, records of websites accessed – but often lack a reliable way to distinguish the signal from the noise.

As a result, the fight against hackers is turning into an uphill battle, the SANS Institute reported in a recent paper.

 “Attackers are taking advantage of the fact that organizations are not finding the indicators of compromise within their environments soon enough, nor are they responding to these incidents and removing them quickly enough,” the paper’s author, Dave Shackleford, noted.

According to the Ponemon 2017 Cost of a Data Breach Study, U.S. companies took an average of 206 days to detect a data breach. Mandiant’s M-Trends 2017 report noted that 53% of breaches were discovered by an external source and not the company’s staff.

The good news: cyber analytics holds out the promise of fine-tuning the search and more precisely identifying the likely vectors of attack – thus enabling cybersecurity teams to make surer decisions about their organizations’ cybersecurity postures.

“We’re seeing heightened interest in analytical techniques as the cybersecurity profession seeks to keep a step ahead of adversaries,” Butler said.

The increased interest, in part, reflects a realization that breaches cannot be prevented entirely – given the number of adversaries, attack surfaces, and potential vulnerabilities, sooner or later an adversary will get through. 

"It is important to remember that cybersecurity is not necessarily about having tools that keep us from getting attacked. In a perfect world that is what we want, but it's not likely,” said Dr. Mary Margaret Chantré, assistant professor in the cybersecurity and cyber analytics programs.

“Cybersecurity is about the ability to be resilient to attacks and recover quickly. A cyber analyst looks at mistakes made in the past and tries to avoid them in the present so he or she can predict possible future attacks. This type of situational awareness helps minimize risk," Chantré said.

In examining threats, cyber analysts not only use traditional methods of statistical analysis – identifying a normal distribution pattern and then recording significant deviations – but also machine learning and algorithmic techniques, such as clustering and density estimation.
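The traditional statistical method described above can be sketched in a few lines: estimate "normal" from past observations, then flag new points that deviate sharply. This is a minimal illustration, not a production detector; the login-failure counts below are hypothetical, and the three-standard-deviation threshold is a common convention rather than a fixed rule.

```python
import statistics

def zscore_flags(baseline, observations, threshold=3.0):
    """Flag observations far from the baseline's normal distribution.

    The baseline models 'normal' network behavior; any new observation
    whose z-score exceeds the threshold is flagged as an anomaly.
    """
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observations if abs(x - mean) / stdev > threshold]

# Hypothetical daily failed-login counts: ten typical days, then
# three new observations, one of which is a dramatic spike.
baseline = [12, 15, 11, 14, 13, 12, 16, 14, 13, 15]
print(zscore_flags(baseline, [14, 240, 13]))  # [240]
```

In practice, clustering and density estimation extend this same idea to many dimensions at once, where a single mean and standard deviation no longer capture what "normal" looks like.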

“It’s a very exciting time to be doing analytics,” Butler said, “both because of the advances in methodology and also the availability of software that can handle data at quantities far beyond the capabilities of an individual human.”

With the rising interest in analytics comes a need for training and education – and Capitol Technology University is meeting the need with undergraduate and graduate programs. The university is one of the first worldwide to launch a cyber analytics degree. In addition to a bachelor of science in cyber analytics, Capitol also offers an online master’s in the field.

Capitol has long been a leader in cybersecurity education, earning three successive Center for Excellence designations from the Department of Homeland Security and the National Security Agency. 

With more and more cyber analysts working side by side with cybersecurity professionals, the two fields are a natural fit.